[DEPRECATION WARNING]: ANSIBLE_COLLECTIONS_PATHS option, does not fit var naming standard, use the singular form ANSIBLE_COLLECTIONS_PATH instead. This feature will be removed from ansible-core in version 2.19. Deprecation warnings can be disabled by setting deprecation_warnings=False in ansible.cfg.
7557 1726882073.71962: starting run
ansible-playbook [core 2.17.4]
  config file = None
  configured module search path = ['/root/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/local/lib/python3.12/site-packages/ansible
  ansible collection location = /tmp/collections-spT
  executable location = /usr/local/bin/ansible-playbook
  python version = 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] (/usr/bin/python3.12)
  jinja version = 3.1.4
  libyaml = True
No config file found; using defaults
7557 1726882073.72247: Added group all to inventory
7557 1726882073.72249: Added group ungrouped to inventory
7557 1726882073.72251: Group all now contains ungrouped
7557 1726882073.72253: Examining possible inventory source: /tmp/network-Kc3/inventory.yml
7557 1726882073.81662: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/cache
7557 1726882073.81705: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py
7557 1726882073.81722: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory
7557 1726882073.81763: Loading InventoryModule 'host_list' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py
7557 1726882073.81812: Loaded config def from plugin (inventory/script)
7557 1726882073.81814: Loading InventoryModule 'script' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py
7557 1726882073.81843: Loading InventoryModule 'auto' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py
7557 1726882073.81903: Loaded config def from plugin (inventory/yaml)
7557 1726882073.81904: Loading InventoryModule 'yaml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py
7557 1726882073.81964: Loading InventoryModule 'ini' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/ini.py
7557 1726882073.82237: Loading InventoryModule 'toml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/toml.py
7557 1726882073.82239: Attempting to use plugin host_list (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py)
7557 1726882073.82242: Attempting to use plugin script (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py)
7557 1726882073.82246: Attempting to use plugin auto (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py)
7557 1726882073.82249: Loading data from /tmp/network-Kc3/inventory.yml
7557 1726882073.82296: /tmp/network-Kc3/inventory.yml was not parsable by auto
7557 1726882073.82338: Attempting to use plugin yaml (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py)
7557 1726882073.82365: Loading data from /tmp/network-Kc3/inventory.yml
7557 1726882073.82437: group all already in inventory
7557 1726882073.82446: set inventory_file for managed_node1
7557 1726882073.82450: set inventory_dir for managed_node1
7557 1726882073.82451: Added host managed_node1 to inventory
7557 1726882073.82453: Added host managed_node1 to group all
7557 1726882073.82454: set ansible_host for managed_node1
7557 1726882073.82455: set ansible_ssh_extra_args
for managed_node1 7557 1726882073.82458: set inventory_file for managed_node2 7557 1726882073.82460: set inventory_dir for managed_node2 7557 1726882073.82461: Added host managed_node2 to inventory 7557 1726882073.82462: Added host managed_node2 to group all 7557 1726882073.82463: set ansible_host for managed_node2 7557 1726882073.82463: set ansible_ssh_extra_args for managed_node2 7557 1726882073.82466: set inventory_file for managed_node3 7557 1726882073.82468: set inventory_dir for managed_node3 7557 1726882073.82468: Added host managed_node3 to inventory 7557 1726882073.82469: Added host managed_node3 to group all 7557 1726882073.82470: set ansible_host for managed_node3 7557 1726882073.82471: set ansible_ssh_extra_args for managed_node3 7557 1726882073.82473: Reconcile groups and hosts in inventory. 7557 1726882073.82476: Group ungrouped now contains managed_node1 7557 1726882073.82478: Group ungrouped now contains managed_node2 7557 1726882073.82480: Group ungrouped now contains managed_node3 7557 1726882073.82552: '/usr/local/lib/python3.12/site-packages/ansible/plugins/vars/__init__' skipped due to reserved name 7557 1726882073.82669: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments 7557 1726882073.82715: Loading ModuleDocFragment 'vars_plugin_staging' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/vars_plugin_staging.py 7557 1726882073.82741: Loaded config def from plugin (vars/host_group_vars) 7557 1726882073.82744: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=False, class_only=True) 7557 1726882073.82750: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/vars 7557 1726882073.82758: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 7557 1726882073.82799: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py (found_in_cache=True, class_only=False) 7557 1726882073.83116: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882073.83197: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py 7557 1726882073.83234: Loaded config def from plugin (connection/local) 7557 1726882073.83237: Loading Connection 'local' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/local.py (found_in_cache=False, class_only=True) 7557 1726882073.83863: Loaded config def from plugin (connection/paramiko_ssh) 7557 1726882073.83867: Loading Connection 'paramiko_ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/paramiko_ssh.py (found_in_cache=False, class_only=True) 7557 1726882073.84737: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False) 7557 1726882073.84777: Loaded config def from plugin (connection/psrp) 7557 1726882073.84780: Loading Connection 'psrp' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/psrp.py (found_in_cache=False, class_only=True) 7557 1726882073.85533: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py 
(found_in_cache=True, class_only=False) 7557 1726882073.85572: Loaded config def from plugin (connection/ssh) 7557 1726882073.85575: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=False, class_only=True) 7557 1726882073.87444: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False) 7557 1726882073.87481: Loaded config def from plugin (connection/winrm) 7557 1726882073.87484: Loading Connection 'winrm' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/winrm.py (found_in_cache=False, class_only=True) 7557 1726882073.87515: '/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/__init__' skipped due to reserved name 7557 1726882073.87573: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py 7557 1726882073.87634: Loaded config def from plugin (shell/cmd) 7557 1726882073.87636: Loading ShellModule 'cmd' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/cmd.py (found_in_cache=False, class_only=True) 7557 1726882073.87659: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py (found_in_cache=True, class_only=False) 7557 1726882073.87725: Loaded config def from plugin (shell/powershell) 7557 1726882073.87727: Loading ShellModule 'powershell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/powershell.py (found_in_cache=False, class_only=True) 7557 1726882073.87780: Loading ModuleDocFragment 'shell_common' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_common.py 7557 1726882073.87958: Loaded config def from plugin (shell/sh) 7557 1726882073.87960: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=False, class_only=True) 7557 1726882073.87996: '/usr/local/lib/python3.12/site-packages/ansible/plugins/become/__init__' skipped due to reserved name 7557 1726882073.88112: Loaded config def from plugin (become/runas) 7557 1726882073.88115: Loading BecomeModule 'runas' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/runas.py (found_in_cache=False, class_only=True) 7557 1726882073.88290: Loaded config def from plugin (become/su) 7557 1726882073.88292: Loading BecomeModule 'su' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/su.py (found_in_cache=False, class_only=True) 7557 1726882073.88447: Loaded config def from plugin (become/sudo) 7557 1726882073.88449: Loading BecomeModule 'sudo' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/sudo.py (found_in_cache=False, class_only=True) running playbook inside collection fedora.linux_system_roles 7557 1726882073.88481: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tests_auto_gateway_nm.yml 7557 1726882073.88802: in VariableManager get_vars() 7557 1726882073.88826: done with get_vars() 7557 1726882073.88959: trying /usr/local/lib/python3.12/site-packages/ansible/modules 7557 1726882073.91935: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action 7557 1726882073.92012: in VariableManager get_vars() 7557 1726882073.92017: done with get_vars() 7557 1726882073.92019: variable 'playbook_dir' from source: magic vars 7557 
1726882073.92019: variable 'ansible_playbook_python' from source: magic vars 7557 1726882073.92020: variable 'ansible_config_file' from source: magic vars 7557 1726882073.92020: variable 'groups' from source: magic vars 7557 1726882073.92021: variable 'omit' from source: magic vars 7557 1726882073.92021: variable 'ansible_version' from source: magic vars 7557 1726882073.92022: variable 'ansible_check_mode' from source: magic vars 7557 1726882073.92022: variable 'ansible_diff_mode' from source: magic vars 7557 1726882073.92022: variable 'ansible_forks' from source: magic vars 7557 1726882073.92023: variable 'ansible_inventory_sources' from source: magic vars 7557 1726882073.92023: variable 'ansible_skip_tags' from source: magic vars 7557 1726882073.92024: variable 'ansible_limit' from source: magic vars 7557 1726882073.92024: variable 'ansible_run_tags' from source: magic vars 7557 1726882073.92024: variable 'ansible_verbosity' from source: magic vars 7557 1726882073.92062: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_auto_gateway.yml 7557 1726882073.92743: in VariableManager get_vars() 7557 1726882073.92759: done with get_vars() 7557 1726882073.92798: in VariableManager get_vars() 7557 1726882073.92811: done with get_vars() 7557 1726882073.92847: in VariableManager get_vars() 7557 1726882073.92859: done with get_vars() 7557 1726882073.93008: in VariableManager get_vars() 7557 1726882073.93021: done with get_vars() 7557 1726882073.93026: variable 'omit' from source: magic vars 7557 1726882073.93045: variable 'omit' from source: magic vars 7557 1726882073.93079: in VariableManager get_vars() 7557 1726882073.93096: done with get_vars() 7557 1726882073.93147: in VariableManager get_vars() 7557 1726882073.93156: done with get_vars() 7557 1726882073.93203: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 7557 1726882073.93342: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 7557 1726882073.93425: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 7557 1726882073.93802: in VariableManager get_vars() 7557 1726882073.93815: done with get_vars() 7557 1726882073.94109: trying /usr/local/lib/python3.12/site-packages/ansible/modules/__pycache__ 7557 1726882073.94197: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__ redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 7557 1726882073.95922: in VariableManager get_vars() 7557 1726882073.95941: done with get_vars() 7557 1726882073.95980: in VariableManager get_vars() 7557 1726882073.96221: done with get_vars() 7557 1726882073.96698: in VariableManager get_vars() 7557 1726882073.96718: done with get_vars() 7557 1726882073.96724: variable 'omit' from source: magic vars 7557 1726882073.96735: variable 'omit' from source: magic vars 7557 1726882073.96767: in VariableManager get_vars() 7557 1726882073.96786: done with get_vars() 7557 1726882073.96811: in VariableManager get_vars() 7557 1726882073.96828: done with get_vars() 7557 1726882073.96855: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 7557 1726882073.96964: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 7557 1726882073.97044: Loading data from 
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 7557 1726882073.99446: in VariableManager get_vars() 7557 1726882073.99471: done with get_vars() redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 7557 1726882074.01385: in VariableManager get_vars() 7557 1726882074.01412: done with get_vars() 7557 1726882074.01519: in VariableManager get_vars() 7557 1726882074.01538: done with get_vars() 7557 1726882074.01595: in VariableManager get_vars() 7557 1726882074.01613: done with get_vars() 7557 1726882074.01619: variable 'omit' from source: magic vars 7557 1726882074.01630: variable 'omit' from source: magic vars 7557 1726882074.01659: in VariableManager get_vars() 7557 1726882074.01674: done with get_vars() 7557 1726882074.01697: in VariableManager get_vars() 7557 1726882074.01714: done with get_vars() 7557 1726882074.01742: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 7557 1726882074.01856: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 7557 1726882074.01935: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 7557 1726882074.02337: in VariableManager get_vars() 7557 1726882074.02358: done with get_vars() redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 7557 1726882074.05928: in VariableManager get_vars() 7557 1726882074.05952: done with get_vars() 7557 1726882074.05991: in VariableManager get_vars() 7557 1726882074.06013: done with get_vars() 7557 1726882074.07002: in VariableManager get_vars() 7557 1726882074.07040: done with get_vars() 7557 1726882074.07045: variable 'omit' from source: magic vars 7557 1726882074.07055: variable 'omit' from source: magic vars 7557 1726882074.07086: in VariableManager get_vars() 7557 1726882074.07109: done with get_vars() 7557 1726882074.07129: in VariableManager get_vars() 7557 1726882074.07148: done with get_vars() 7557 1726882074.07174: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 7557 1726882074.07284: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 7557 1726882074.07570: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 7557 1726882074.08362: in VariableManager get_vars() 7557 1726882074.08387: done with get_vars() redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 7557 1726882074.12187: in VariableManager get_vars() 7557 1726882074.12222: done with get_vars() 7557 1726882074.12258: in VariableManager get_vars() 7557 1726882074.12279: done with get_vars() 7557 1726882074.12337: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback 7557 1726882074.12349: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__ redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug 7557 1726882074.12613: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py 7557 1726882074.12768: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.debug) 7557 1726882074.12771: Loading CallbackModule 
'ansible_collections.ansible.posix.plugins.callback.debug' from /tmp/collections-spT/ansible_collections/ansible/posix/plugins/callback/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) 7557 1726882074.12807: '/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__init__' skipped due to reserved name 7557 1726882074.12832: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py (found_in_cache=True, class_only=False) 7557 1726882074.13075: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py 7557 1726882074.13167: Loaded config def from plugin (callback/default) 7557 1726882074.13170: Loading CallbackModule 'default' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/default.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True) 7557 1726882074.14427: Loaded config def from plugin (callback/junit) 7557 1726882074.14430: Loading CallbackModule 'junit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/junit.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True) 7557 1726882074.14476: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py (found_in_cache=True, class_only=False) 7557 1726882074.14547: Loaded config def from plugin (callback/minimal) 7557 1726882074.14550: Loading CallbackModule 'minimal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/minimal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True) 7557 1726882074.14591: Loading CallbackModule 'oneline' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/oneline.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True) 7557 1726882074.14654: Loaded config def from plugin (callback/tree) 7557 1726882074.14656: Loading CallbackModule 'tree' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/tree.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True) redirecting (type: callback) ansible.builtin.profile_tasks to ansible.posix.profile_tasks 7557 1726882074.14780: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.profile_tasks) 7557 1726882074.14783: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.profile_tasks' from /tmp/collections-spT/ansible_collections/ansible/posix/plugins/callback/profile_tasks.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True)
Skipping callback 'default', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback.
Skipping callback 'oneline', as we already have a stdout callback.

PLAYBOOK: tests_auto_gateway_nm.yml ********************************************
2 plays in /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tests_auto_gateway_nm.yml
7557 1726882074.14817: in VariableManager get_vars()
7557 1726882074.14831: done with get_vars()
7557 1726882074.14837: in VariableManager get_vars()
7557 1726882074.14846: done with get_vars()
7557 1726882074.14850: variable 'omit' from source: magic vars
7557 1726882074.14886: in VariableManager get_vars()
7557 1726882074.14904: done with get_vars()
7557 1726882074.14924: variable 'omit' from source: magic vars

PLAY [Run playbook 'playbooks/tests_auto_gateway.yml' with nm as provider] *****
7557 1726882074.15899: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy
7557 1726882074.15983: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py
7557 1726882074.16106: getting the remaining hosts for this loop
7557 1726882074.16108: done getting the remaining hosts for this loop
7557 1726882074.16111: getting the next task for host managed_node3
7557 1726882074.16115: done getting next task for host managed_node3
7557 1726882074.16116: ^ task is: TASK: Gathering Facts
7557 1726882074.16118: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
7557 1726882074.16120: getting variables
7557 1726882074.16121: in VariableManager get_vars()
7557 1726882074.16130: Calling all_inventory to load vars for managed_node3
7557 1726882074.16132: Calling groups_inventory to load vars for managed_node3
7557 1726882074.16135: Calling all_plugins_inventory to load vars for managed_node3
7557 1726882074.16146: Calling all_plugins_play to load vars for managed_node3
7557 1726882074.16155: Calling groups_plugins_inventory to load vars for managed_node3
7557 1726882074.16158: Calling groups_plugins_play to load vars for managed_node3
7557 1726882074.16197: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
7557 1726882074.16253: done with get_vars()
7557 1726882074.16259: done getting variables
7557 1726882074.16554: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True)

TASK [Gathering Facts] *********************************************************
task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tests_auto_gateway_nm.yml:6
Friday 20 September 2024 21:27:54 -0400 (0:00:00.018) 0:00:00.018 ******
7557 1726882074.16581: entering _queue_task() for managed_node3/gather_facts
7557 1726882074.16583: Creating lock for gather_facts
7557 1726882074.17191: worker is 1 (out of 1 available)
7557 1726882074.17207: exiting _queue_task() for managed_node3/gather_facts
7557 1726882074.17226: done queuing things up, now waiting for results queue to drain
7557 1726882074.17228: waiting for pending results...
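The PLAY banner above is produced by the thin wrapper playbook this run loaded, tests_auto_gateway_nm.yml, whose job is to fix the provider and then pull in the shared test playbook that the loader read a few entries earlier (playbooks/tests_auto_gateway.yml). The wrapper's actual contents are not reproduced in this log; the following is only a rough sketch of what such a provider wrapper in the fedora.linux_system_roles test suite typically looks like, and the network_provider variable name is an assumption rather than something taken from the output above.

# Hypothetical sketch of tests_auto_gateway_nm.yml, not copied from this run
---
- name: Run playbook 'playbooks/tests_auto_gateway.yml' with nm as provider
  hosts: all
  tasks:
    - name: Set network provider to 'nm'   # assumed variable name
      set_fact:
        network_provider: nm
      tags:
        - always

# The shared test body is pulled in as the second play.
- import_playbook: playbooks/tests_auto_gateway.yml

With a layout like this, the implicit 'Gathering Facts' task whose banner appears just above runs first for the wrapper play, which is why the first remote activity in the log is the setup module rather than any network role task.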
7557 1726882074.17378: running TaskExecutor() for managed_node3/TASK: Gathering Facts 7557 1726882074.17441: in run() - task 12673a56-9f93-ed48-b3a5-000000000155 7557 1726882074.17448: variable 'ansible_search_path' from source: unknown 7557 1726882074.17477: calling self._execute() 7557 1726882074.17524: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882074.17530: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882074.17540: variable 'omit' from source: magic vars 7557 1726882074.17610: variable 'omit' from source: magic vars 7557 1726882074.17630: variable 'omit' from source: magic vars 7557 1726882074.17659: variable 'omit' from source: magic vars 7557 1726882074.17692: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7557 1726882074.17719: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7557 1726882074.17734: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7557 1726882074.17747: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7557 1726882074.17759: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7557 1726882074.17798: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7557 1726882074.17801: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882074.17804: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882074.17859: Set connection var ansible_module_compression to ZIP_DEFLATED 7557 1726882074.17866: Set connection var ansible_shell_executable to /bin/sh 7557 1726882074.17870: Set connection var ansible_shell_type to sh 7557 1726882074.17873: Set connection var ansible_pipelining to False 7557 1726882074.17875: Set connection var ansible_connection to ssh 7557 1726882074.17885: Set connection var ansible_timeout to 10 7557 1726882074.17900: variable 'ansible_shell_executable' from source: unknown 7557 1726882074.17903: variable 'ansible_connection' from source: unknown 7557 1726882074.17905: variable 'ansible_module_compression' from source: unknown 7557 1726882074.17908: variable 'ansible_shell_type' from source: unknown 7557 1726882074.17910: variable 'ansible_shell_executable' from source: unknown 7557 1726882074.17913: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882074.17915: variable 'ansible_pipelining' from source: unknown 7557 1726882074.17917: variable 'ansible_timeout' from source: unknown 7557 1726882074.17922: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882074.18066: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 7557 1726882074.18074: variable 'omit' from source: magic vars 7557 1726882074.18079: starting attempt loop 7557 1726882074.18082: running the handler 7557 1726882074.18096: variable 'ansible_facts' from source: unknown 7557 1726882074.18113: _low_level_execute_command(): starting 7557 1726882074.18120: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && 
sleep 0' 7557 1726882074.18596: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7557 1726882074.18625: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882074.18629: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882074.18631: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882074.18702: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882074.18709: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882074.18779: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882074.20447: stdout chunk (state=3): >>>/root <<< 7557 1726882074.20582: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882074.20586: stdout chunk (state=3): >>><<< 7557 1726882074.20588: stderr chunk (state=3): >>><<< 7557 1726882074.20616: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7557 1726882074.20700: _low_level_execute_command(): starting 7557 1726882074.20704: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882074.2062354-7579-101367005246587 `" && echo ansible-tmp-1726882074.2062354-7579-101367005246587="` echo /root/.ansible/tmp/ansible-tmp-1726882074.2062354-7579-101367005246587 `" ) && sleep 0' 7557 1726882074.21263: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 
Jun 2024 <<< 7557 1726882074.21277: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7557 1726882074.21298: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882074.21340: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7557 1726882074.21367: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 <<< 7557 1726882074.21381: stderr chunk (state=3): >>>debug2: match not found <<< 7557 1726882074.21510: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882074.21533: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882074.21554: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882074.21733: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882074.23587: stdout chunk (state=3): >>>ansible-tmp-1726882074.2062354-7579-101367005246587=/root/.ansible/tmp/ansible-tmp-1726882074.2062354-7579-101367005246587 <<< 7557 1726882074.23729: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882074.23740: stdout chunk (state=3): >>><<< 7557 1726882074.23751: stderr chunk (state=3): >>><<< 7557 1726882074.23777: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882074.2062354-7579-101367005246587=/root/.ansible/tmp/ansible-tmp-1726882074.2062354-7579-101367005246587 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7557 1726882074.23828: variable 'ansible_module_compression' from source: unknown 7557 1726882074.23886: ANSIBALLZ: Using generic lock for ansible.legacy.setup 7557 1726882074.23900: ANSIBALLZ: Acquiring lock 
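The 'umask 77 && mkdir -p ... ansible-tmp-...' round trip above, and the sftp upload of AnsiballZ_setup.py further down, are the non-pipelined execution path: earlier in this task the connection was prepared with 'Set connection var ansible_pipelining to False', so each module run needs a remote temp directory, a file transfer, and a cleanup. As a general note rather than something this test run configures, pipelining can be enabled per host or group so that most modules are streamed to the remote interpreter over stdin instead; a minimal sketch, assuming the managed nodes allow it (for example no 'requiretty' in sudoers), with the group_vars placement itself being an assumption:

# group_vars/all.yml -- illustrative placement, not a file from this run
ansible_pipelining: true   # stream module code over stdin; avoids the remote temp dir and sftp put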
7557 1726882074.23908: ANSIBALLZ: Lock acquired: 140194287013904 7557 1726882074.23919: ANSIBALLZ: Creating module 7557 1726882074.61500: ANSIBALLZ: Writing module into payload 7557 1726882074.61699: ANSIBALLZ: Writing module 7557 1726882074.61703: ANSIBALLZ: Renaming module 7557 1726882074.61705: ANSIBALLZ: Done creating module 7557 1726882074.61732: variable 'ansible_facts' from source: unknown 7557 1726882074.61744: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7557 1726882074.61762: _low_level_execute_command(): starting 7557 1726882074.61772: _low_level_execute_command(): executing: /bin/sh -c 'echo PLATFORM; uname; echo FOUND; command -v '"'"'python3.12'"'"'; command -v '"'"'python3.11'"'"'; command -v '"'"'python3.10'"'"'; command -v '"'"'python3.9'"'"'; command -v '"'"'python3.8'"'"'; command -v '"'"'python3.7'"'"'; command -v '"'"'/usr/bin/python3'"'"'; command -v '"'"'python3'"'"'; echo ENDFOUND && sleep 0' 7557 1726882074.62820: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882074.62886: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882074.62906: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882074.62998: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882074.64646: stdout chunk (state=3): >>>PLATFORM <<< 7557 1726882074.64724: stdout chunk (state=3): >>>Linux <<< 7557 1726882074.64749: stdout chunk (state=3): >>>FOUND /usr/bin/python3.12 /usr/bin/python3 /usr/bin/python3 ENDFOUND <<< 7557 1726882074.65118: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882074.65122: stdout chunk (state=3): >>><<< 7557 1726882074.65124: stderr chunk (state=3): >>><<< 7557 1726882074.65127: _low_level_execute_command() done: rc=0, stdout=PLATFORM Linux FOUND /usr/bin/python3.12 /usr/bin/python3 /usr/bin/python3 ENDFOUND , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7557 1726882074.65131 [managed_node3]: found interpreters: ['/usr/bin/python3.12', '/usr/bin/python3', '/usr/bin/python3'] 7557 1726882074.65151: _low_level_execute_command(): starting 7557 1726882074.65161: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 && sleep 0' 7557 1726882074.65458: Sending initial data 7557 1726882074.65468: Sent initial data (1181 bytes) 7557 1726882074.66508: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882074.66533: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7557 1726882074.66546: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 <<< 7557 1726882074.66708: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882074.66750: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882074.66779: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882074.66875: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882074.70229: stdout chunk (state=3): >>>{"platform_dist_result": [], "osrelease_content": "NAME=\"CentOS Stream\"\nVERSION=\"10 (Coughlan)\"\nID=\"centos\"\nID_LIKE=\"rhel fedora\"\nVERSION_ID=\"10\"\nPLATFORM_ID=\"platform:el10\"\nPRETTY_NAME=\"CentOS Stream 10 (Coughlan)\"\nANSI_COLOR=\"0;31\"\nLOGO=\"fedora-logo-icon\"\nCPE_NAME=\"cpe:/o:centos:centos:10\"\nHOME_URL=\"https://centos.org/\"\nVENDOR_NAME=\"CentOS\"\nVENDOR_URL=\"https://centos.org/\"\nBUG_REPORT_URL=\"https://issues.redhat.com/\"\nREDHAT_SUPPORT_PRODUCT=\"Red Hat Enterprise Linux 10\"\nREDHAT_SUPPORT_PRODUCT_VERSION=\"CentOS Stream\"\n"} <<< 7557 1726882074.70626: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882074.70643: stdout chunk (state=3): >>><<< 7557 1726882074.70656: stderr chunk (state=3): >>><<< 7557 1726882074.70675: _low_level_execute_command() done: rc=0, stdout={"platform_dist_result": [], "osrelease_content": "NAME=\"CentOS Stream\"\nVERSION=\"10 (Coughlan)\"\nID=\"centos\"\nID_LIKE=\"rhel fedora\"\nVERSION_ID=\"10\"\nPLATFORM_ID=\"platform:el10\"\nPRETTY_NAME=\"CentOS Stream 
10 (Coughlan)\"\nANSI_COLOR=\"0;31\"\nLOGO=\"fedora-logo-icon\"\nCPE_NAME=\"cpe:/o:centos:centos:10\"\nHOME_URL=\"https://centos.org/\"\nVENDOR_NAME=\"CentOS\"\nVENDOR_URL=\"https://centos.org/\"\nBUG_REPORT_URL=\"https://issues.redhat.com/\"\nREDHAT_SUPPORT_PRODUCT=\"Red Hat Enterprise Linux 10\"\nREDHAT_SUPPORT_PRODUCT_VERSION=\"CentOS Stream\"\n"} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7557 1726882074.70770: variable 'ansible_facts' from source: unknown 7557 1726882074.70779: variable 'ansible_facts' from source: unknown 7557 1726882074.70829: variable 'ansible_module_compression' from source: unknown 7557 1726882074.70847: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-7557ap94rh2e/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 7557 1726882074.70881: variable 'ansible_facts' from source: unknown 7557 1726882074.71086: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882074.2062354-7579-101367005246587/AnsiballZ_setup.py 7557 1726882074.71343: Sending initial data 7557 1726882074.71346: Sent initial data (152 bytes) 7557 1726882074.73613: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882074.73907: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882074.74031: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882074.74287: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 
<<< 7557 1726882074.75796: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 7557 1726882074.75832: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 7557 1726882074.75877: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-7557ap94rh2e/tmp0vq43_db /root/.ansible/tmp/ansible-tmp-1726882074.2062354-7579-101367005246587/AnsiballZ_setup.py <<< 7557 1726882074.75880: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882074.2062354-7579-101367005246587/AnsiballZ_setup.py" <<< 7557 1726882074.75950: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-7557ap94rh2e/tmp0vq43_db" to remote "/root/.ansible/tmp/ansible-tmp-1726882074.2062354-7579-101367005246587/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882074.2062354-7579-101367005246587/AnsiballZ_setup.py" <<< 7557 1726882074.78874: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882074.78878: stdout chunk (state=3): >>><<< 7557 1726882074.78882: stderr chunk (state=3): >>><<< 7557 1726882074.78884: done transferring module to remote 7557 1726882074.78886: _low_level_execute_command(): starting 7557 1726882074.78888: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882074.2062354-7579-101367005246587/ /root/.ansible/tmp/ansible-tmp-1726882074.2062354-7579-101367005246587/AnsiballZ_setup.py && sleep 0' 7557 1726882074.80081: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7557 1726882074.80086: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882074.80092: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration <<< 7557 1726882074.80211: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found <<< 7557 1726882074.80220: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882074.80242: stderr chunk (state=3): >>>debug1: auto-mux: 
Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882074.80276: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882074.80316: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882074.82195: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882074.82400: stdout chunk (state=3): >>><<< 7557 1726882074.82403: stderr chunk (state=3): >>><<< 7557 1726882074.82406: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7557 1726882074.82408: _low_level_execute_command(): starting 7557 1726882074.82410: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882074.2062354-7579-101367005246587/AnsiballZ_setup.py && sleep 0' 7557 1726882074.83464: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7557 1726882074.83480: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7557 1726882074.83609: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882074.83673: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882074.83797: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882074.83838: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882074.83953: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 
1726882074.86099: stdout chunk (state=3): >>>import _frozen_importlib # frozen <<< 7557 1726882074.86174: stdout chunk (state=3): >>>import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # <<< 7557 1726882074.86288: stdout chunk (state=3): >>>import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook <<< 7557 1726882074.86406: stdout chunk (state=3): >>>import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # <<< 7557 1726882074.86440: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py <<< 7557 1726882074.86508: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b1db684d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b1db37b30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' <<< 7557 1726882074.86525: stdout chunk (state=3): >>>import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b1db6aa50> import '_signal' # <<< 7557 1726882074.86615: stdout chunk (state=3): >>>import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # <<< 7557 1726882074.86707: stdout chunk (state=3): >>>import '_collections_abc' # <<< 7557 1726882074.86738: stdout chunk (state=3): >>>import 'genericpath' # import 'posixpath' # <<< 7557 1726882074.86795: stdout chunk (state=3): >>>import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' <<< 7557 1726882074.86812: stdout chunk (state=3): >>>Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' <<< 7557 1726882074.86861: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' <<< 7557 1726882074.86873: stdout chunk (state=3): >>>import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b1d93d130> <<< 7557 1726882074.86934: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' <<< 7557 1726882074.86979: stdout chunk (state=3): >>>import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b1d93dfa0> import 'site' # <<< 7557 1726882074.86982: stdout chunk (state=3): >>>Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
<<< 7557 1726882074.87350: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py <<< 7557 1726882074.87390: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' <<< 7557 1726882074.87395: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py <<< 7557 1726882074.87436: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py <<< 7557 1726882074.87623: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py <<< 7557 1726882074.87626: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b1d97bdd0> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py <<< 7557 1726882074.87638: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b1d97bfe0> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py <<< 7557 1726882074.87845: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b1d9b37a0> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b1d9b3e30> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b1d993aa0> import '_functools' # <<< 7557 1726882074.87850: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b1d9911c0> <<< 7557 1726882074.87933: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b1d978f80> <<< 7557 1726882074.87975: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py <<< 7557 1726882074.87989: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # <<< 7557 1726882074.88024: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' <<< 7557 1726882074.88074: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' <<< 7557 1726882074.88175: stdout chunk (state=3): >>>import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b1d9d3710> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b1d9d2330> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py <<< 7557 1726882074.88195: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b1d992090> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b1d9d0b90> <<< 7557 1726882074.88219: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b1da08740> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b1d978200> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' <<< 7557 1726882074.88285: stdout chunk (state=3): >>># extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' <<< 7557 1726882074.88289: stdout chunk (state=3): >>># extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b1da08bf0> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b1da08aa0> <<< 7557 1726882074.88347: stdout chunk (state=3): >>># extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b1da08e90> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b1d976d20> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' <<< 7557 1726882074.88399: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py <<< 7557 1726882074.88503: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b1da09580> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b1da09250> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' <<< 7557 1726882074.88566: stdout chunk (state=3): >>>import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b1da0a480> import 
'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py <<< 7557 1726882074.88621: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b1da20680> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' <<< 7557 1726882074.88670: stdout chunk (state=3): >>># extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b1da21d60> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py <<< 7557 1726882074.88673: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b1da22c00> <<< 7557 1726882074.88762: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b1da23260> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b1da22150> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py <<< 7557 1726882074.88790: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' <<< 7557 1726882074.88809: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b1da23ce0> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b1da23410> <<< 7557 1726882074.88851: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b1da0a4b0> <<< 7557 1726882074.88925: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py <<< 7557 1726882074.88969: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' <<< 7557 1726882074.88973: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' <<< 7557 1726882074.89008: stdout chunk (state=3): >>># extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b1d717bc0> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py <<< 7557 1726882074.89012: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b1d7406e0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b1d740440> <<< 7557 1726882074.89088: stdout chunk (state=3): >>># extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b1d740710> <<< 7557 1726882074.89091: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py <<< 7557 1726882074.89096: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' <<< 7557 1726882074.89142: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 7557 1726882074.89273: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b1d741040> <<< 7557 1726882074.89373: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' <<< 7557 1726882074.89399: stdout chunk (state=3): >>># extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b1d741a30> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b1d7408f0> <<< 7557 1726882074.89431: stdout chunk (state=3): >>>import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b1d715d60> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py <<< 7557 1726882074.89458: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' <<< 7557 1726882074.89473: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' <<< 7557 1726882074.89524: stdout chunk (state=3): >>>import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b1d742de0> <<< 7557 1726882074.89527: stdout chunk (state=3): >>>import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b1d741b50> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b1da0aba0> <<< 7557 1726882074.89549: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py <<< 7557 
1726882074.89627: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' <<< 7557 1726882074.89630: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py <<< 7557 1726882074.89662: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' <<< 7557 1726882074.89683: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b1d76f140> <<< 7557 1726882074.89745: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py <<< 7557 1726882074.89754: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' <<< 7557 1726882074.89786: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py <<< 7557 1726882074.89789: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' <<< 7557 1726882074.89844: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b1d78f500> <<< 7557 1726882074.89847: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py <<< 7557 1726882074.89891: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' <<< 7557 1726882074.89943: stdout chunk (state=3): >>>import 'ntpath' # <<< 7557 1726882074.89977: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b1d7f02c0> <<< 7557 1726882074.90011: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py <<< 7557 1726882074.90014: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' <<< 7557 1726882074.90045: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py <<< 7557 1726882074.90074: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' <<< 7557 1726882074.90165: stdout chunk (state=3): >>>import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b1d7f2a20> <<< 7557 1726882074.90229: stdout chunk (state=3): >>>import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b1d7f03e0> <<< 7557 1726882074.90278: stdout chunk (state=3): >>>import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b1d7b92e0> <<< 7557 1726882074.90314: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b1d1253d0> <<< 7557 1726882074.90324: stdout chunk (state=3): >>>import 'zipfile._path' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f2b1d78e300> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b1d743d10> <<< 7557 1726882074.90504: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc' <<< 7557 1726882074.90514: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f2b1d78e900> <<< 7557 1726882074.90836: stdout chunk (state=3): >>># zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_9dlkelwf/ansible_ansible.legacy.setup_payload.zip' # zipimport: zlib available <<< 7557 1726882074.90951: stdout chunk (state=3): >>># zipimport: zlib available <<< 7557 1726882074.90983: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' <<< 7557 1726882074.91029: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py <<< 7557 1726882074.91101: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' <<< 7557 1726882074.91160: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b1d18b0b0> import '_typing' # <<< 7557 1726882074.91375: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b1d169fa0> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b1d169130> # zipimport: zlib available import 'ansible' # <<< 7557 1726882074.91415: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available <<< 7557 1726882074.92806: stdout chunk (state=3): >>># zipimport: zlib available <<< 7557 1726882074.93955: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b1d188f80> <<< 7557 1726882074.93959: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py <<< 7557 1726882074.93961: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' <<< 7557 1726882074.93964: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' <<< 7557 1726882074.93996: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' <<< 7557 1726882074.94010: stdout chunk (state=3): >>># extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from 
'/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b1d1ba960> <<< 7557 1726882074.94046: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b1d1ba6f0> <<< 7557 1726882074.94096: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b1d1ba000> <<< 7557 1726882074.94100: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' <<< 7557 1726882074.94137: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b1d1baa50> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b1d18bd40> import 'atexit' # <<< 7557 1726882074.94183: stdout chunk (state=3): >>># extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b1d1bb6b0> <<< 7557 1726882074.94218: stdout chunk (state=3): >>># extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b1d1bb8f0> <<< 7557 1726882074.94222: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py <<< 7557 1726882074.94278: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' <<< 7557 1726882074.94292: stdout chunk (state=3): >>>import '_locale' # <<< 7557 1726882074.94322: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b1d1bbe30> <<< 7557 1726882074.94346: stdout chunk (state=3): >>>import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py <<< 7557 1726882074.94374: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' <<< 7557 1726882074.94413: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b1d025be0> <<< 7557 1726882074.94445: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' <<< 7557 1726882074.94467: stdout chunk (state=3): >>># extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b1d027800> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py <<< 7557 1726882074.94478: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' <<< 7557 1726882074.94529: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b1d0281d0> <<< 7557 1726882074.94536: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py <<< 7557 1726882074.94553: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' <<< 7557 1726882074.94586: stdout chunk (state=3): >>>import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b1d029370> <<< 7557 1726882074.94590: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py <<< 7557 1726882074.94651: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' <<< 7557 1726882074.94654: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py <<< 7557 1726882074.94664: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' <<< 7557 1726882074.94745: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b1d02be30> <<< 7557 1726882074.94764: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b1d1bbef0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b1d02a0f0> <<< 7557 1726882074.94811: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py <<< 7557 1726882074.94856: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py <<< 7557 1726882074.95065: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b1d033ce0> <<< 7557 1726882074.95070: stdout chunk (state=3): >>>import '_tokenize' # <<< 7557 1726882074.95082: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b1d0327b0> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b1d032510> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py <<< 7557 1726882074.95115: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' <<< 7557 1726882074.95317: stdout chunk (state=3): >>>import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b1d032a80> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b1d02a600> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from 
'/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b1d077fe0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b1d078110> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' <<< 7557 1726882074.95344: stdout chunk (state=3): >>># extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' <<< 7557 1726882074.95360: stdout chunk (state=3): >>># extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b1d079bb0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b1d079970> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py <<< 7557 1726882074.95391: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' <<< 7557 1726882074.95440: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b1d07c110> <<< 7557 1726882074.95459: stdout chunk (state=3): >>>import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b1d07a2a0> <<< 7557 1726882074.95528: stdout chunk (state=3): >>># /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py <<< 7557 1726882074.95534: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' <<< 7557 1726882074.95536: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py <<< 7557 1726882074.95545: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # <<< 7557 1726882074.95585: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b1d07f860> <<< 7557 1726882074.95711: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b1d07c230> <<< 7557 1726882074.95770: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b1d0806b0> <<< 
7557 1726882074.95919: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' <<< 7557 1726882074.95932: stdout chunk (state=3): >>># extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b1d080890> <<< 7557 1726882074.95936: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b1d080a10> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b1d078320> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' <<< 7557 1726882074.95951: stdout chunk (state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' <<< 7557 1726882074.95975: stdout chunk (state=3): >>># extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b1cf0c0b0> <<< 7557 1726882074.96126: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' <<< 7557 1726882074.96132: stdout chunk (state=3): >>># extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b1cf0d3a0> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b1d082870> <<< 7557 1726882074.96207: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b1d083c20> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b1d082510> # zipimport: zlib available # zipimport: zlib available <<< 7557 1726882074.96239: stdout chunk (state=3): >>>import 'ansible.module_utils.compat' # # zipimport: zlib available <<< 7557 1726882074.96418: stdout chunk (state=3): >>># zipimport: zlib available <<< 7557 1726882074.96458: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available <<< 7557 1726882074.96562: stdout chunk (state=3): >>># zipimport: zlib available <<< 7557 1726882074.96675: stdout chunk 
(state=3): >>># zipimport: zlib available <<< 7557 1726882074.97214: stdout chunk (state=3): >>># zipimport: zlib available <<< 7557 1726882074.97739: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # <<< 7557 1726882074.97769: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # <<< 7557 1726882074.97792: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' <<< 7557 1726882074.97832: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' <<< 7557 1726882074.97861: stdout chunk (state=3): >>># extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b1cf11580> <<< 7557 1726882074.97928: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py <<< 7557 1726882074.97951: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b1cf122a0> <<< 7557 1726882074.97968: stdout chunk (state=3): >>>import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b1d7419d0> <<< 7557 1726882074.98007: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.selinux' # <<< 7557 1726882074.98037: stdout chunk (state=3): >>># zipimport: zlib available <<< 7557 1726882074.98048: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available <<< 7557 1726882074.98202: stdout chunk (state=3): >>># zipimport: zlib available <<< 7557 1726882074.98357: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py <<< 7557 1726882074.98368: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b1cf12180> # zipimport: zlib available <<< 7557 1726882074.98815: stdout chunk (state=3): >>># zipimport: zlib available <<< 7557 1726882074.99329: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 7557 1726882074.99403: stdout chunk (state=3): >>>import 'ansible.module_utils.common.collections' # <<< 7557 1726882074.99428: stdout chunk (state=3): >>># zipimport: zlib available <<< 7557 1726882074.99564: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available <<< 7557 1726882074.99836: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available <<< 7557 1726882074.99966: stdout chunk (state=3): >>># zipimport: zlib available <<< 7557 1726882075.00198: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py <<< 7557 1726882075.00257: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' <<< 7557 1726882075.00287: stdout chunk (state=3): >>>import '_ast' # <<< 7557 1726882075.00338: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b1cf13440> # zipimport: zlib available <<< 7557 1726882075.00410: stdout chunk (state=3): >>># zipimport: zlib available <<< 7557 1726882075.00491: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # <<< 7557 1726882075.00611: stdout chunk (state=3): >>># zipimport: zlib available <<< 7557 1726882075.00614: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available <<< 7557 1726882075.00682: stdout chunk (state=3): >>># zipimport: zlib available <<< 7557 1726882075.00704: stdout chunk (state=3): >>># zipimport: zlib available <<< 7557 1726882075.00773: stdout chunk (state=3): >>># zipimport: zlib available <<< 7557 1726882075.00821: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py <<< 7557 1726882075.00852: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' <<< 7557 1726882075.00933: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b1cf1dfa0> <<< 7557 1726882075.00966: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b1cf196d0> <<< 7557 1726882075.01013: stdout chunk (state=3): >>>import 'ansible.module_utils.common.file' # <<< 7557 1726882075.01016: stdout chunk (state=3): >>>import 'ansible.module_utils.common.process' # # zipimport: zlib available <<< 7557 1726882075.01070: stdout chunk (state=3): >>># zipimport: zlib available <<< 7557 1726882075.01133: stdout chunk (state=3): >>># zipimport: zlib available <<< 7557 1726882075.01159: stdout chunk (state=3): >>># zipimport: zlib available <<< 7557 1726882075.01208: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py <<< 7557 1726882075.01246: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py <<< 7557 1726882075.01262: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py <<< 7557 1726882075.01328: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' <<< 7557 1726882075.01353: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py <<< 7557 1726882075.01369: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' <<< 7557 1726882075.01406: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b1d006ab0> <<< 7557 1726882075.01455: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b1d0fe780> <<< 7557 1726882075.01544: stdout chunk (state=3): >>>import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b1cf1e2a0> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b1cf1e060> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available <<< 7557 1726882075.01573: stdout chunk (state=3): >>># zipimport: zlib available <<< 7557 1726882075.01607: stdout chunk (state=3): >>>import 'ansible.module_utils.common._utils' # <<< 7557 1726882075.01619: stdout chunk (state=3): >>>import 'ansible.module_utils.common.sys_info' # <<< 7557 1726882075.01686: stdout chunk (state=3): >>>import 'ansible.module_utils.basic' # <<< 7557 1726882075.01690: stdout chunk (state=3): >>># zipimport: zlib available <<< 7557 1726882075.01707: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available <<< 7557 1726882075.01758: stdout chunk (state=3): >>># zipimport: zlib available <<< 7557 1726882075.01826: stdout chunk (state=3): >>># zipimport: zlib available <<< 7557 1726882075.01849: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 7557 1726882075.01896: stdout chunk (state=3): >>># zipimport: zlib available <<< 7557 1726882075.01935: stdout chunk (state=3): >>># zipimport: zlib available <<< 7557 1726882075.01970: stdout chunk (state=3): >>># zipimport: zlib available <<< 7557 1726882075.02022: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.namespace' # <<< 7557 1726882075.02038: stdout chunk (state=3): >>># zipimport: zlib available <<< 7557 1726882075.02096: stdout chunk (state=3): >>># zipimport: zlib available <<< 7557 1726882075.02163: stdout chunk (state=3): >>># zipimport: zlib available <<< 7557 1726882075.02177: stdout chunk (state=3): >>># zipimport: zlib available <<< 7557 1726882075.02212: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.typing' # # zipimport: zlib available <<< 7557 1726882075.02396: stdout chunk (state=3): >>># zipimport: zlib available <<< 7557 1726882075.02557: stdout chunk (state=3): >>># zipimport: zlib available <<< 7557 1726882075.02604: stdout chunk (state=3): >>># zipimport: zlib available <<< 7557 1726882075.02659: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' <<< 7557 1726882075.02688: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py <<< 7557 1726882075.02707: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # 
/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py <<< 7557 1726882075.02756: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' <<< 7557 1726882075.02759: stdout chunk (state=3): >>>import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b1cfb20c0> <<< 7557 1726882075.02786: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' <<< 7557 1726882075.02809: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py <<< 7557 1726882075.02853: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' <<< 7557 1726882075.02882: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py <<< 7557 1726882075.02898: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b1cb3ffb0> <<< 7557 1726882075.02938: stdout chunk (state=3): >>># extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' <<< 7557 1726882075.02941: stdout chunk (state=3): >>># extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b1cb444a0> <<< 7557 1726882075.02994: stdout chunk (state=3): >>>import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b1cf9b200> <<< 7557 1726882075.03009: stdout chunk (state=3): >>>import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b1cfb2c60> <<< 7557 1726882075.03035: stdout chunk (state=3): >>>import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b1cfb0770> <<< 7557 1726882075.03062: stdout chunk (state=3): >>>import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b1cfb0b60> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py <<< 7557 1726882075.03116: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' <<< 7557 1726882075.03130: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' <<< 7557 1726882075.03155: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' <<< 7557 1726882075.03201: stdout chunk (state=3): >>># extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader 
object at 0x7f2b1cb47350> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b1cb46c00> <<< 7557 1726882075.03266: stdout chunk (state=3): >>># extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b1cb46de0> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b1cb46030> <<< 7557 1726882075.03284: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py <<< 7557 1726882075.03457: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b1cb47530> # /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b1cbaa060> <<< 7557 1726882075.03484: stdout chunk (state=3): >>>import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b1cb47f80> <<< 7557 1726882075.03543: stdout chunk (state=3): >>>import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b1cfb11c0> <<< 7557 1726882075.03626: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # <<< 7557 1726882075.03659: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other' # # zipimport: zlib available <<< 7557 1726882075.03662: stdout chunk (state=3): >>># zipimport: zlib available <<< 7557 1726882075.03689: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.facter' # # zipimport: zlib available <<< 7557 1726882075.03812: stdout chunk (state=3): >>># zipimport: zlib available <<< 7557 1726882075.03816: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.ohai' # # zipimport: zlib available <<< 7557 1726882075.03896: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available <<< 7557 1726882075.03908: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available <<< 7557 1726882075.03942: stdout chunk (state=3): >>># zipimport: zlib available <<< 7557 1726882075.04004: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available <<< 7557 1726882075.04090: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available <<< 7557 1726882075.04156: stdout chunk (state=3): >>># zipimport: zlib available <<< 7557 1726882075.04347: stdout chunk (state=3): >>># zipimport: zlib 
available <<< 7557 1726882075.04350: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.utils' # <<< 7557 1726882075.04374: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available <<< 7557 1726882075.04810: stdout chunk (state=3): >>># zipimport: zlib available <<< 7557 1726882075.05242: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib available <<< 7557 1726882075.05301: stdout chunk (state=3): >>># zipimport: zlib available <<< 7557 1726882075.05352: stdout chunk (state=3): >>># zipimport: zlib available <<< 7557 1726882075.05396: stdout chunk (state=3): >>># zipimport: zlib available <<< 7557 1726882075.05425: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available <<< 7557 1726882075.05448: stdout chunk (state=3): >>># zipimport: zlib available <<< 7557 1726882075.05482: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available <<< 7557 1726882075.05544: stdout chunk (state=3): >>># zipimport: zlib available <<< 7557 1726882075.05603: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.dns' # <<< 7557 1726882075.05625: stdout chunk (state=3): >>># zipimport: zlib available <<< 7557 1726882075.05650: stdout chunk (state=3): >>># zipimport: zlib available <<< 7557 1726882075.05673: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available <<< 7557 1726882075.05706: stdout chunk (state=3): >>># zipimport: zlib available <<< 7557 1726882075.05742: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available <<< 7557 1726882075.05819: stdout chunk (state=3): >>># zipimport: zlib available <<< 7557 1726882075.05907: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' <<< 7557 1726882075.05938: stdout chunk (state=3): >>>import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b1cbaa2a0> <<< 7557 1726882075.05959: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py <<< 7557 1726882075.05994: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' <<< 7557 1726882075.06125: stdout chunk (state=3): >>>import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b1cbaaea0> <<< 7557 1726882075.06128: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available <<< 7557 1726882075.06249: stdout chunk (state=3): >>># zipimport: zlib available <<< 7557 1726882075.06291: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.lsb' # # zipimport: zlib available <<< 7557 1726882075.06338: stdout chunk (state=3): >>># zipimport: zlib available <<< 7557 1726882075.06439: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.pkg_mgr' # # zipimport: zlib available <<< 7557 1726882075.06672: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches 
/usr/lib64/python3.12/ssl.py <<< 7557 1726882075.06700: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' <<< 7557 1726882075.06766: stdout chunk (state=3): >>># extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' <<< 7557 1726882075.06819: stdout chunk (state=3): >>># extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' <<< 7557 1726882075.06836: stdout chunk (state=3): >>>import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b1cbe63f0> <<< 7557 1726882075.07020: stdout chunk (state=3): >>>import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b1cbd60c0> import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available <<< 7557 1726882075.07069: stdout chunk (state=3): >>># zipimport: zlib available <<< 7557 1726882075.07131: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.selinux' # <<< 7557 1726882075.07211: stdout chunk (state=3): >>># zipimport: zlib available <<< 7557 1726882075.07235: stdout chunk (state=3): >>># zipimport: zlib available <<< 7557 1726882075.07309: stdout chunk (state=3): >>># zipimport: zlib available <<< 7557 1726882075.07408: stdout chunk (state=3): >>># zipimport: zlib available <<< 7557 1726882075.07560: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.version' # <<< 7557 1726882075.07573: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available <<< 7557 1726882075.07600: stdout chunk (state=3): >>># zipimport: zlib available <<< 7557 1726882075.07650: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.ssh_pub_keys' # <<< 7557 1726882075.07660: stdout chunk (state=3): >>># zipimport: zlib available <<< 7557 1726882075.07682: stdout chunk (state=3): >>># zipimport: zlib available <<< 7557 1726882075.07732: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py <<< 7557 1726882075.07777: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' <<< 7557 1726882075.07803: stdout chunk (state=3): >>># extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b1cbf9b50> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b1cbe5be0> import 'ansible.module_utils.facts.system.user' # <<< 7557 1726882075.07829: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available <<< 7557 1726882075.07882: stdout chunk (state=3): >>># zipimport: zlib available <<< 7557 1726882075.07932: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.base' # <<< 7557 1726882075.07935: stdout chunk (state=3): >>># zipimport: zlib available <<< 7557 1726882075.08072: stdout chunk (state=3): >>># zipimport: zlib available <<< 7557 1726882075.08222: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available <<< 7557 1726882075.08330: stdout chunk (state=3): >>># zipimport: zlib available <<< 7557 
1726882075.08428: stdout chunk (state=3): >>># zipimport: zlib available <<< 7557 1726882075.08472: stdout chunk (state=3): >>># zipimport: zlib available <<< 7557 1726882075.08536: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.sysctl' # <<< 7557 1726882075.08540: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.darwin' # <<< 7557 1726882075.08574: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 7557 1726882075.08577: stdout chunk (state=3): >>># zipimport: zlib available <<< 7557 1726882075.08707: stdout chunk (state=3): >>># zipimport: zlib available <<< 7557 1726882075.08850: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available <<< 7557 1726882075.08981: stdout chunk (state=3): >>># zipimport: zlib available <<< 7557 1726882075.09122: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.hpux' # <<< 7557 1726882075.09126: stdout chunk (state=3): >>># zipimport: zlib available <<< 7557 1726882075.09145: stdout chunk (state=3): >>># zipimport: zlib available <<< 7557 1726882075.09169: stdout chunk (state=3): >>># zipimport: zlib available <<< 7557 1726882075.09782: stdout chunk (state=3): >>># zipimport: zlib available <<< 7557 1726882075.10456: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # <<< 7557 1726882075.10459: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.netbsd' # <<< 7557 1726882075.10461: stdout chunk (state=3): >>># zipimport: zlib available <<< 7557 1726882075.10540: stdout chunk (state=3): >>># zipimport: zlib available <<< 7557 1726882075.10635: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.openbsd' # <<< 7557 1726882075.10648: stdout chunk (state=3): >>># zipimport: zlib available <<< 7557 1726882075.10802: stdout chunk (state=3): >>># zipimport: zlib available <<< 7557 1726882075.10998: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.sunos' # <<< 7557 1726882075.11001: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network' # <<< 7557 1726882075.11003: stdout chunk (state=3): >>># zipimport: zlib available <<< 7557 1726882075.11031: stdout chunk (state=3): >>># zipimport: zlib available <<< 7557 1726882075.11083: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.base' # <<< 7557 1726882075.11085: stdout chunk (state=3): >>># zipimport: zlib available <<< 7557 1726882075.11176: stdout chunk (state=3): >>># zipimport: zlib available <<< 7557 1726882075.11275: stdout chunk (state=3): >>># zipimport: zlib available <<< 7557 1726882075.11471: stdout chunk (state=3): >>># zipimport: zlib available <<< 7557 1726882075.11681: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # <<< 7557 1726882075.11691: stdout chunk (state=3): >>># zipimport: zlib available <<< 7557 1726882075.11725: stdout chunk (state=3): >>># zipimport: zlib available<<< 7557 1726882075.11734: stdout chunk (state=3): >>> <<< 7557 1726882075.11775: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.darwin' # <<< 7557 1726882075.11791: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib 
available <<< 7557 1726882075.11820: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.dragonfly' # <<< 7557 1726882075.11828: stdout chunk (state=3): >>># zipimport: zlib available <<< 7557 1726882075.11899: stdout chunk (state=3): >>># zipimport: zlib available <<< 7557 1726882075.11966: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.fc_wwn' # <<< 7557 1726882075.11987: stdout chunk (state=3): >>># zipimport: zlib available <<< 7557 1726882075.12020: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available <<< 7557 1726882075.12279: stdout chunk (state=3): >>># zipimport: zlib available <<< 7557 1726882075.12283: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available <<< 7557 1726882075.12285: stdout chunk (state=3): >>># zipimport: zlib available <<< 7557 1726882075.12288: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hurd' # <<< 7557 1726882075.12290: stdout chunk (state=3): >>># zipimport: zlib available <<< 7557 1726882075.12599: stdout chunk (state=3): >>># zipimport: zlib available <<< 7557 1726882075.12776: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.linux' # <<< 7557 1726882075.12843: stdout chunk (state=3): >>># zipimport: zlib available <<< 7557 1726882075.12858: stdout chunk (state=3): >>># zipimport: zlib available <<< 7557 1726882075.12904: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available <<< 7557 1726882075.12948: stdout chunk (state=3): >>># zipimport: zlib available <<< 7557 1726882075.12987: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.nvme' # <<< 7557 1726882075.13015: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 7557 1726882075.13063: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available <<< 7557 1726882075.13086: stdout chunk (state=3): >>># zipimport: zlib available <<< 7557 1726882075.13175: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available <<< 7557 1726882075.13212: stdout chunk (state=3): >>># zipimport: zlib available <<< 7557 1726882075.13283: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.sunos' # <<< 7557 1726882075.13424: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available # zipimport: zlib available <<< 7557 1726882075.13428: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.base' # <<< 7557 1726882075.13447: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 7557 1726882075.13470: stdout chunk (state=3): >>># zipimport: zlib available <<< 7557 1726882075.13512: stdout chunk (state=3): >>># zipimport: zlib available <<< 7557 1726882075.13563: stdout chunk (state=3): >>># zipimport: zlib available <<< 7557 1726882075.13724: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available <<< 7557 1726882075.13898: stdout chunk (state=3): >>># zipimport: zlib available <<< 7557 1726882075.13901: stdout chunk (state=3): >>>import 
'ansible.module_utils.facts.virtual.hpux' # <<< 7557 1726882075.13903: stdout chunk (state=3): >>># zipimport: zlib available <<< 7557 1726882075.14049: stdout chunk (state=3): >>># zipimport: zlib available <<< 7557 1726882075.14292: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.netbsd' # <<< 7557 1726882075.14314: stdout chunk (state=3): >>># zipimport: zlib available <<< 7557 1726882075.14344: stdout chunk (state=3): >>># zipimport: zlib available <<< 7557 1726882075.14389: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.openbsd' # <<< 7557 1726882075.14417: stdout chunk (state=3): >>># zipimport: zlib available <<< 7557 1726882075.14499: stdout chunk (state=3): >>># zipimport: zlib available <<< 7557 1726882075.14571: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # # zipimport: zlib available <<< 7557 1726882075.14660: stdout chunk (state=3): >>># zipimport: zlib available <<< 7557 1726882075.14750: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # <<< 7557 1726882075.14830: stdout chunk (state=3): >>># zipimport: zlib available <<< 7557 1726882075.15036: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' <<< 7557 1726882075.15057: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py <<< 7557 1726882075.15097: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' <<< 7557 1726882075.15101: stdout chunk (state=3): >>># extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' <<< 7557 1726882075.15110: stdout chunk (state=3): >>># extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b1c9f7350> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b1c9f7080> <<< 7557 1726882075.15148: stdout chunk (state=3): >>>import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b1c9f4230> <<< 7557 1726882075.26121: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/queues.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc' import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b1c9f7d10> # /usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/synchronize.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc' import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b1ca3c560> # /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/__init__.py # code object 
from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc' <<< 7557 1726882075.26126: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/connection.py <<< 7557 1726882075.26128: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc' import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b1ca3dd30> <<< 7557 1726882075.26131: stdout chunk (state=3): >>>import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b1ca3d820> <<< 7557 1726882075.26619: stdout chunk (state=3): >>>PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame <<< 7557 1726882075.50688: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_loadavg": {"1m": 0.0068359375, "5m": 0.1484375, "15m": 0.0927734375}, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-10-229.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-10-229", "ansible_nodename": "ip-10-31-10-229.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec23ea4468ccc875d6f6db60ff64318a", "ansible_local": {}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCv7uM8iExTeI4wsxGEirDCIB5rfuashDyqixAMrsgojV44m9e49NO3hj7ILsTTBL2CHnfLuLE1/PLpq7UY8Z1Z8ro+SmmXu++VXRqryH5co2uqHva7V6sHb6D0w7V9QhBLpdZFYEoP0DS5gVD9JQFynOilgl8wt/jWccIG1lWZi9pozQdP7A/myzjixT/sJ/dwyz8xvTWJg8mm1MsbYn2WTH8iil55RGt5+Srq66y14fY2WfYG2fpZAu2FUQP08MxFIAzAetJatr6cWpPKpSpFt3GxBUw9mZMYCqrmgqwBD/PAtXD6Q7x/7qAtiiHsfMBTZienaA1mW1aNHB5lYinW+yIEPJsEXOfVQXD7Grje437Hq7ilY2Ls8shFo/H1kZ7MVesrrJ0x/2SBU9GvKJMaweWKcsmmll+jNBUuGX6ts04Vmsca92EMTJvbEZ5S0c4wSIE0d0Abf1Xqh6e9aP6EWDz6EY13coJ8t20q68K2L8C+7SV2ymAL1nKR36KDmUU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBK8+EpkEsEK0/7/tF+Ot2JevPtJYRlnBvekg0Ue9FRv3lrN7bw8W95KfTN9YYbHxSXwfmPM7CC79pp6v7bDk8dE=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIFW1A+ae3pfP8rgVu0EA2QvBQu2xPGiaOdV7VpH2SdJ3", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 
1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 3037, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 494, "free": 3037}, "nocache": {"free": 3311, "used": 220}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec23ea44-68cc-c875-d6f6-db60ff64318a", "ansible_product_uuid": "ec23ea44-68cc-c875-d6f6-db60ff64318a", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 382, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261815549952, "block_size": 4096, "block_total": 65519099, "block_available": 63919812, "block_used": 1599287, "inode_total": 131070960, "inode_available": 131029184, "inode_used": 41776, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_fibre_channel_wwn": [], "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_lsb": {}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, 
"ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.11.248 53716 10.31.10.229 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.11.248 53716 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_service_mgr": "systemd", "ansible_is_chroot": false, "ansible_interfaces": ["eth0", "lo"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], 
"hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "12:87:27:91:87:37", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.10.229", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::1087:27ff:fe91:8737", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_<<< 7557 1726882075.50724: stdout chunk (state=3): >>>scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.10.229", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:87:27:91:87:37", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.10.229"], "ansible_all_ipv6_addresses": ["fe80::1087:27ff:fe91:8737"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.10.229", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::1087:27ff:fe91:8737"]}, "ansible_apparmor": {"status": "disabled"}, "ansible_fips": false, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", 
"ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "27", "second": "55", "epoch": "1726882075", "epoch_int": "1726882075", "date": "2024-09-20", "time": "21:27:55", "iso8601_micro": "2024-09-21T01:27:55.502925Z", "iso8601": "2024-09-21T01:27:55Z", "iso8601_basic": "20240920T212755502925", "iso8601_basic_short": "20240920T212755", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_iscsi_iqn": "", "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 7557 1726882075.51304: stdout chunk (state=3): >>># clear sys.path_importer_cache <<< 7557 1726882075.51534: stdout chunk (state=3): >>># clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants <<< 7557 1726882075.51544: stdout chunk (state=3): >>># cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing 
importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters<<< 
7557 1726882075.51667: stdout chunk (state=3): >>> # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic <<< 7557 1726882075.51740: stdout chunk (state=3): >>># destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing 
ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing 
ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base <<< 7557 1726882075.51757: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy 
ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly <<< 7557 1726882075.51784: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy <<< 7557 1726882075.52011: stdout chunk (state=3): >>># destroy _sitebuiltins <<< 7557 1726882075.52070: stdout chunk (state=3): >>># destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma <<< 7557 1726882075.52100: stdout chunk (state=3): >>># destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path <<< 7557 1726882075.52225: stdout chunk (state=3): >>># destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale <<< 7557 1726882075.52240: stdout chunk (state=3): >>># destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid <<< 7557 1726882075.52319: stdout chunk (state=3): >>># destroy selinux # destroy shutil <<< 7557 1726882075.52324: stdout chunk (state=3): >>># destroy distro # destroy distro.distro # destroy argparse # destroy logging <<< 7557 1726882075.52383: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue <<< 7557 1726882075.52940: stdout chunk (state=3): >>># destroy multiprocessing.reduction # destroy selectors # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy unicodedata # destroy errno # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing # destroy array # destroy multiprocessing.dummy.connection # 
cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime <<< 7557 1726882075.52988: stdout chunk (state=3): >>># destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize <<< 7557 1726882075.52992: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib <<< 7557 1726882075.53040: stdout chunk (state=3): >>># destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser <<< 7557 1726882075.53066: stdout chunk (state=3): >>># destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal <<< 7557 
1726882075.53114: stdout chunk (state=3): >>># clear sys.meta_path # clear sys.modules # destroy _frozen_importlib <<< 7557 1726882075.53277: stdout chunk (state=3): >>># destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit <<< 7557 1726882075.53298: stdout chunk (state=3): >>># destroy _warnings # destroy math # destroy _bisect # destroy time <<< 7557 1726882075.53328: stdout chunk (state=3): >>># destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _sre # destroy _string # destroy re # destroy itertools <<< 7557 1726882075.53402: stdout chunk (state=3): >>># destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread <<< 7557 1726882075.53405: stdout chunk (state=3): >>># clear sys.audit hooks <<< 7557 1726882075.54087: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. <<< 7557 1726882075.54092: stdout chunk (state=3): >>><<< 7557 1726882075.54100: stderr chunk (state=3): >>><<< 7557 1726882075.54262: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b1db684d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b1db37b30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b1db6aa50> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b1d93d130> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from 
'/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b1d93dfa0> import 'site' # Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. # /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b1d97bdd0> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b1d97bfe0> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b1d9b37a0> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b1d9b3e30> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b1d993aa0> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b1d9911c0> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b1d978f80> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b1d9d3710> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b1d9d2330> # 
/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b1d992090> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b1d9d0b90> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b1da08740> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b1d978200> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b1da08bf0> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b1da08aa0> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b1da08e90> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b1d976d20> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b1da09580> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b1da09250> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b1da0a480> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b1da20680> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b1da21d60> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches 
/usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b1da22c00> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b1da23260> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b1da22150> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b1da23ce0> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b1da23410> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b1da0a4b0> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b1d717bc0> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b1d7406e0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b1d740440> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b1d740710> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object 
at 0x7f2b1d741040> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b1d741a30> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b1d7408f0> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b1d715d60> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b1d742de0> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b1d741b50> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b1da0aba0> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b1d76f140> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b1d78f500> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b1d7f02c0> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b1d7f2a20> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b1d7f03e0> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b1d7b92e0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 
'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b1d1253d0> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b1d78e300> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b1d743d10> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f2b1d78e900> # zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_9dlkelwf/ansible_ansible.legacy.setup_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b1d18b0b0> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b1d169fa0> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b1d169130> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b1d188f80> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b1d1ba960> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b1d1ba6f0> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b1d1ba000> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b1d1baa50> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b1d18bd40> import 'atexit' # # extension module 'grp' loaded from 
'/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b1d1bb6b0> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b1d1bb8f0> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b1d1bbe30> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b1d025be0> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b1d027800> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b1d0281d0> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b1d029370> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b1d02be30> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b1d1bbef0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b1d02a0f0> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # 
/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b1d033ce0> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b1d0327b0> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b1d032510> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b1d032a80> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b1d02a600> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b1d077fe0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b1d078110> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b1d079bb0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b1d079970> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b1d07c110> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b1d07a2a0> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b1d07f860> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b1d07c230> # extension module 'systemd._journal' 
loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b1d0806b0> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b1d080890> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b1d080a10> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b1d078320> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b1cf0c0b0> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b1cf0d3a0> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b1d082870> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b1d083c20> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b1d082510> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches 
/usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b1cf11580> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b1cf122a0> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b1d7419d0> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b1cf12180> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b1cf13440> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b1cf1dfa0> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b1cf196d0> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib 
available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b1d006ab0> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b1d0fe780> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b1cf1e2a0> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b1cf1e060> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.typing' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b1cfb20c0> # /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py # code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import 
'_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b1cb3ffb0> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b1cb444a0> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b1cf9b200> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b1cfb2c60> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b1cfb0770> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b1cfb0b60> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b1cb47350> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b1cb46c00> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b1cb46de0> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b1cb46030> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b1cb47530> # /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b1cbaa060> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b1cb47f80> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b1cfb11c0> import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other' # # 
zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.facter' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.ohai' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.dns' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b1cbaa2a0> # /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py # code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b1cbaaea0> import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.lsb' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.pkg_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py # code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b1cbe63f0> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b1cbd60c0> import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.version' # import 
'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b1cbf9b50> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b1cbe5be0> import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.hpux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.darwin' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available # zipimport: zlib available import 
'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # # zipimport: zlib available # /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b1c9f7350> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b1c9f7080> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b1c9f4230> # /usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/queues.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc' import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b1c9f7d10> # /usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/synchronize.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc' import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b1ca3c560> # /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc matches 
/usr/lib64/python3.12/multiprocessing/dummy/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc' import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b1ca3dd30> import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b1ca3d820> PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame {"ansible_facts": {"ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_loadavg": {"1m": 0.0068359375, "5m": 0.1484375, "15m": 0.0927734375}, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-10-229.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-10-229", "ansible_nodename": "ip-10-31-10-229.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec23ea4468ccc875d6f6db60ff64318a", "ansible_local": {}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCv7uM8iExTeI4wsxGEirDCIB5rfuashDyqixAMrsgojV44m9e49NO3hj7ILsTTBL2CHnfLuLE1/PLpq7UY8Z1Z8ro+SmmXu++VXRqryH5co2uqHva7V6sHb6D0w7V9QhBLpdZFYEoP0DS5gVD9JQFynOilgl8wt/jWccIG1lWZi9pozQdP7A/myzjixT/sJ/dwyz8xvTWJg8mm1MsbYn2WTH8iil55RGt5+Srq66y14fY2WfYG2fpZAu2FUQP08MxFIAzAetJatr6cWpPKpSpFt3GxBUw9mZMYCqrmgqwBD/PAtXD6Q7x/7qAtiiHsfMBTZienaA1mW1aNHB5lYinW+yIEPJsEXOfVQXD7Grje437Hq7ilY2Ls8shFo/H1kZ7MVesrrJ0x/2SBU9GvKJMaweWKcsmmll+jNBUuGX6ts04Vmsca92EMTJvbEZ5S0c4wSIE0d0Abf1Xqh6e9aP6EWDz6EY13coJ8t20q68K2L8C+7SV2ymAL1nKR36KDmUU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBK8+EpkEsEK0/7/tF+Ot2JevPtJYRlnBvekg0Ue9FRv3lrN7bw8W95KfTN9YYbHxSXwfmPM7CC79pp6v7bDk8dE=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIFW1A+ae3pfP8rgVu0EA2QvBQu2xPGiaOdV7VpH2SdJ3", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 3037, "ansible_swaptotal_mb": 0, 
"ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 494, "free": 3037}, "nocache": {"free": 3311, "used": 220}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec23ea44-68cc-c875-d6f6-db60ff64318a", "ansible_product_uuid": "ec23ea44-68cc-c875-d6f6-db60ff64318a", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 382, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261815549952, "block_size": 4096, "block_total": 65519099, "block_available": 63919812, "block_used": 1599287, "inode_total": 131070960, "inode_available": 131029184, "inode_used": 41776, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_fibre_channel_wwn": [], "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_lsb": {}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", 
"HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.11.248 53716 10.31.10.229 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.11.248 53716 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_service_mgr": "systemd", "ansible_is_chroot": false, "ansible_interfaces": ["eth0", "lo"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "12:87:27:91:87:37", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": 
false, "ipv4": {"address": "10.31.10.229", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::1087:27ff:fe91:8737", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.10.229", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:87:27:91:87:37", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.10.229"], "ansible_all_ipv6_addresses": ["fe80::1087:27ff:fe91:8737"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.10.229", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::1087:27ff:fe91:8737"]}, "ansible_apparmor": {"status": "disabled"}, "ansible_fips": false, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": 
"UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "27", "second": "55", "epoch": "1726882075", "epoch_int": "1726882075", "date": "2024-09-20", "time": "21:27:55", "iso8601_micro": "2024-09-21T01:27:55.502925Z", "iso8601": "2024-09-21T01:27:55Z", "iso8601_basic": "20240920T212755502925", "iso8601_basic_short": "20240920T212755", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_iscsi_iqn": "", "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # 
cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing 
ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy 
ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing 
ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # 
destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.reduction # destroy selectors # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy unicodedata # destroy errno # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing # destroy array # destroy multiprocessing.dummy.connection # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # 
cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. [WARNING]: Module invocation had junk after the JSON data: # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # 
destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing 
ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing 
ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing 
ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat 
# cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.reduction # destroy selectors # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy unicodedata # destroy errno # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing # destroy array # destroy multiprocessing.dummy.connection # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy 
re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks [WARNING]: Platform linux on host managed_node3 is using the discovered Python interpreter at /usr/bin/python3.12, but future installation of another Python interpreter could change the meaning of that path. See https://docs.ansible.com/ansible-core/2.17/reference_appendices/interpreter_discovery.html for more information. 
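The interpreter-discovery warning above means Ansible probed managed_node3 and found /usr/bin/python3.12 on its own rather than being told which interpreter to use; if the interpreter were pinned explicitly (for example as a host variable), the path could no longer shift when another Python is installed later. The snippet below is only an illustrative sketch, not the contents of the actual /tmp/network-Kc3/inventory.yml used in this run: the host name, address, and interpreter path are taken from the log output above, and the rest of the layout is assumed.

# Hypothetical inventory excerpt: pin the interpreter so discovery is
# deterministic on managed_node3. ansible_python_interpreter is the
# standard variable for this; the values mirror what the log reports.
all:
  hosts:
    managed_node3:
      ansible_host: "10.31.10.229"
      ansible_python_interpreter: /usr/bin/python3.12
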
7557 1726882075.55587: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882074.2062354-7579-101367005246587/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7557 1726882075.55591: _low_level_execute_command(): starting 7557 1726882075.55596: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882074.2062354-7579-101367005246587/ > /dev/null 2>&1 && sleep 0' 7557 1726882075.55599: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7557 1726882075.55601: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7557 1726882075.55603: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882075.55606: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7557 1726882075.55608: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 <<< 7557 1726882075.55610: stderr chunk (state=3): >>>debug2: match not found <<< 7557 1726882075.55612: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882075.55614: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882075.55659: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882075.55743: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882075.58127: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882075.58151: stderr chunk (state=3): >>><<< 7557 1726882075.58155: stdout chunk (state=3): >>><<< 7557 1726882075.58171: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7557 1726882075.58178: handler run complete 7557 1726882075.58261: variable 'ansible_facts' from source: unknown 7557 1726882075.58328: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882075.58521: variable 'ansible_facts' from source: unknown 7557 1726882075.58573: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882075.58652: attempt loop complete, returning result 7557 1726882075.58655: _execute() done 7557 1726882075.58658: dumping result to json 7557 1726882075.58676: done dumping result, returning 7557 1726882075.58683: done running TaskExecutor() for managed_node3/TASK: Gathering Facts [12673a56-9f93-ed48-b3a5-000000000155] 7557 1726882075.58688: sending task result for task 12673a56-9f93-ed48-b3a5-000000000155 7557 1726882075.59035: done sending task result for task 12673a56-9f93-ed48-b3a5-000000000155 7557 1726882075.59039: WORKER PROCESS EXITING ok: [managed_node3] 7557 1726882075.59486: no more pending results, returning what we have 7557 1726882075.59492: results queue empty 7557 1726882075.59498: checking for any_errors_fatal 7557 1726882075.59499: done checking for any_errors_fatal 7557 1726882075.59500: checking for max_fail_percentage 7557 1726882075.59502: done checking for max_fail_percentage 7557 1726882075.59502: checking to see if all hosts have failed and the running result is not ok 7557 1726882075.59503: done checking to see if all hosts have failed 7557 1726882075.59504: getting the remaining hosts for this loop 7557 1726882075.59505: done getting the remaining hosts for this loop 7557 1726882075.59508: getting the next task for host managed_node3 7557 1726882075.59514: done getting next task for host managed_node3 7557 1726882075.59516: ^ task is: TASK: meta (flush_handlers) 7557 1726882075.59518: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7557 1726882075.59522: getting variables 7557 1726882075.59523: in VariableManager get_vars() 7557 1726882075.59544: Calling all_inventory to load vars for managed_node3 7557 1726882075.59547: Calling groups_inventory to load vars for managed_node3 7557 1726882075.59550: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882075.59559: Calling all_plugins_play to load vars for managed_node3 7557 1726882075.59562: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882075.59565: Calling groups_plugins_play to load vars for managed_node3 7557 1726882075.59778: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882075.59959: done with get_vars() 7557 1726882075.59966: done getting variables 7557 1726882075.60026: in VariableManager get_vars() 7557 1726882075.60033: Calling all_inventory to load vars for managed_node3 7557 1726882075.60036: Calling groups_inventory to load vars for managed_node3 7557 1726882075.60039: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882075.60042: Calling all_plugins_play to load vars for managed_node3 7557 1726882075.60044: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882075.60045: Calling groups_plugins_play to load vars for managed_node3 7557 1726882075.60133: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882075.60249: done with get_vars() 7557 1726882075.60260: done queuing things up, now waiting for results queue to drain 7557 1726882075.60261: results queue empty 7557 1726882075.60262: checking for any_errors_fatal 7557 1726882075.60263: done checking for any_errors_fatal 7557 1726882075.60268: checking for max_fail_percentage 7557 1726882075.60269: done checking for max_fail_percentage 7557 1726882075.60269: checking to see if all hosts have failed and the running result is not ok 7557 1726882075.60270: done checking to see if all hosts have failed 7557 1726882075.60270: getting the remaining hosts for this loop 7557 1726882075.60271: done getting the remaining hosts for this loop 7557 1726882075.60273: getting the next task for host managed_node3 7557 1726882075.60278: done getting next task for host managed_node3 7557 1726882075.60279: ^ task is: TASK: Include the task 'el_repo_setup.yml' 7557 1726882075.60280: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7557 1726882075.60282: getting variables 7557 1726882075.60282: in VariableManager get_vars() 7557 1726882075.60288: Calling all_inventory to load vars for managed_node3 7557 1726882075.60290: Calling groups_inventory to load vars for managed_node3 7557 1726882075.60291: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882075.60296: Calling all_plugins_play to load vars for managed_node3 7557 1726882075.60298: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882075.60300: Calling groups_plugins_play to load vars for managed_node3 7557 1726882075.60422: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882075.60591: done with get_vars() 7557 1726882075.60600: done getting variables TASK [Include the task 'el_repo_setup.yml'] ************************************ task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tests_auto_gateway_nm.yml:11 Friday 20 September 2024 21:27:55 -0400 (0:00:01.440) 0:00:01.459 ****** 7557 1726882075.60672: entering _queue_task() for managed_node3/include_tasks 7557 1726882075.60675: Creating lock for include_tasks 7557 1726882075.60960: worker is 1 (out of 1 available) 7557 1726882075.60972: exiting _queue_task() for managed_node3/include_tasks 7557 1726882075.60984: done queuing things up, now waiting for results queue to drain 7557 1726882075.60985: waiting for pending results... 7557 1726882075.61525: running TaskExecutor() for managed_node3/TASK: Include the task 'el_repo_setup.yml' 7557 1726882075.61530: in run() - task 12673a56-9f93-ed48-b3a5-000000000006 7557 1726882075.61533: variable 'ansible_search_path' from source: unknown 7557 1726882075.61538: calling self._execute() 7557 1726882075.61541: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882075.61545: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882075.61547: variable 'omit' from source: magic vars 7557 1726882075.61571: _execute() done 7557 1726882075.61574: dumping result to json 7557 1726882075.61577: done dumping result, returning 7557 1726882075.61584: done running TaskExecutor() for managed_node3/TASK: Include the task 'el_repo_setup.yml' [12673a56-9f93-ed48-b3a5-000000000006] 7557 1726882075.61594: sending task result for task 12673a56-9f93-ed48-b3a5-000000000006 7557 1726882075.61691: done sending task result for task 12673a56-9f93-ed48-b3a5-000000000006 7557 1726882075.61743: WORKER PROCESS EXITING 7557 1726882075.61794: no more pending results, returning what we have 7557 1726882075.61799: in VariableManager get_vars() 7557 1726882075.61825: Calling all_inventory to load vars for managed_node3 7557 1726882075.61827: Calling groups_inventory to load vars for managed_node3 7557 1726882075.61830: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882075.61840: Calling all_plugins_play to load vars for managed_node3 7557 1726882075.61842: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882075.61845: Calling groups_plugins_play to load vars for managed_node3 7557 1726882075.61997: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882075.62102: done with get_vars() 7557 1726882075.62107: variable 'ansible_search_path' from source: unknown 7557 1726882075.62117: we have included files to process 7557 1726882075.62117: generating all_blocks data 7557 1726882075.62118: 
done generating all_blocks data 7557 1726882075.62118: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 7557 1726882075.62119: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 7557 1726882075.62121: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 7557 1726882075.62604: in VariableManager get_vars() 7557 1726882075.62627: done with get_vars() 7557 1726882075.62637: done processing included file 7557 1726882075.62639: iterating over new_blocks loaded from include file 7557 1726882075.62648: in VariableManager get_vars() 7557 1726882075.62669: done with get_vars() 7557 1726882075.62670: filtering new block on tags 7557 1726882075.62684: done filtering new block on tags 7557 1726882075.62687: in VariableManager get_vars() 7557 1726882075.62699: done with get_vars() 7557 1726882075.62701: filtering new block on tags 7557 1726882075.62715: done filtering new block on tags 7557 1726882075.62718: in VariableManager get_vars() 7557 1726882075.62728: done with get_vars() 7557 1726882075.62729: filtering new block on tags 7557 1726882075.62743: done filtering new block on tags 7557 1726882075.62745: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml for managed_node3 7557 1726882075.62750: extending task lists for all hosts with included blocks 7557 1726882075.62798: done extending task lists 7557 1726882075.62800: done processing included files 7557 1726882075.62801: results queue empty 7557 1726882075.62802: checking for any_errors_fatal 7557 1726882075.62803: done checking for any_errors_fatal 7557 1726882075.62804: checking for max_fail_percentage 7557 1726882075.62805: done checking for max_fail_percentage 7557 1726882075.62805: checking to see if all hosts have failed and the running result is not ok 7557 1726882075.62806: done checking to see if all hosts have failed 7557 1726882075.62807: getting the remaining hosts for this loop 7557 1726882075.62808: done getting the remaining hosts for this loop 7557 1726882075.62811: getting the next task for host managed_node3 7557 1726882075.62815: done getting next task for host managed_node3 7557 1726882075.62818: ^ task is: TASK: Gather the minimum subset of ansible_facts required by the network role test 7557 1726882075.62820: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7557 1726882075.62822: getting variables 7557 1726882075.62823: in VariableManager get_vars() 7557 1726882075.62831: Calling all_inventory to load vars for managed_node3 7557 1726882075.62834: Calling groups_inventory to load vars for managed_node3 7557 1726882075.62836: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882075.62841: Calling all_plugins_play to load vars for managed_node3 7557 1726882075.62843: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882075.62846: Calling groups_plugins_play to load vars for managed_node3 7557 1726882075.63000: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882075.63173: done with get_vars() 7557 1726882075.63181: done getting variables TASK [Gather the minimum subset of ansible_facts required by the network role test] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:3 Friday 20 September 2024 21:27:55 -0400 (0:00:00.025) 0:00:01.485 ****** 7557 1726882075.63246: entering _queue_task() for managed_node3/setup 7557 1726882075.63572: worker is 1 (out of 1 available) 7557 1726882075.63583: exiting _queue_task() for managed_node3/setup 7557 1726882075.63712: done queuing things up, now waiting for results queue to drain 7557 1726882075.63714: waiting for pending results... 7557 1726882075.63953: running TaskExecutor() for managed_node3/TASK: Gather the minimum subset of ansible_facts required by the network role test 7557 1726882075.63984: in run() - task 12673a56-9f93-ed48-b3a5-000000000166 7557 1726882075.64102: variable 'ansible_search_path' from source: unknown 7557 1726882075.64109: variable 'ansible_search_path' from source: unknown 7557 1726882075.64113: calling self._execute() 7557 1726882075.64137: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882075.64149: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882075.64162: variable 'omit' from source: magic vars 7557 1726882075.64649: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 7557 1726882075.66813: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 7557 1726882075.66882: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 7557 1726882075.66937: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 7557 1726882075.66976: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 7557 1726882075.67012: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 7557 1726882075.67138: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7557 1726882075.67142: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7557 1726882075.67172: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, 
class_only=False) 7557 1726882075.67221: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7557 1726882075.67249: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7557 1726882075.67619: variable 'ansible_facts' from source: unknown 7557 1726882075.67742: variable 'network_test_required_facts' from source: task vars 7557 1726882075.67769: Evaluated conditional (not ansible_facts.keys() | list | intersect(network_test_required_facts) == network_test_required_facts): True 7557 1726882075.67780: variable 'omit' from source: magic vars 7557 1726882075.67839: variable 'omit' from source: magic vars 7557 1726882075.67903: variable 'omit' from source: magic vars 7557 1726882075.67915: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7557 1726882075.67946: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7557 1726882075.68230: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7557 1726882075.68234: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7557 1726882075.68237: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7557 1726882075.68239: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7557 1726882075.68241: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882075.68243: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882075.68415: Set connection var ansible_module_compression to ZIP_DEFLATED 7557 1726882075.68442: Set connection var ansible_shell_executable to /bin/sh 7557 1726882075.68473: Set connection var ansible_shell_type to sh 7557 1726882075.68491: Set connection var ansible_pipelining to False 7557 1726882075.68504: Set connection var ansible_connection to ssh 7557 1726882075.68514: Set connection var ansible_timeout to 10 7557 1726882075.68545: variable 'ansible_shell_executable' from source: unknown 7557 1726882075.68553: variable 'ansible_connection' from source: unknown 7557 1726882075.68560: variable 'ansible_module_compression' from source: unknown 7557 1726882075.68565: variable 'ansible_shell_type' from source: unknown 7557 1726882075.68573: variable 'ansible_shell_executable' from source: unknown 7557 1726882075.68576: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882075.68579: variable 'ansible_pipelining' from source: unknown 7557 1726882075.68581: variable 'ansible_timeout' from source: unknown 7557 1726882075.68645: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882075.68736: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 7557 1726882075.68764: variable 'omit' from source: magic vars 7557 1726882075.68774: starting attempt loop 7557 1726882075.68782: running the handler 7557 1726882075.68802: 
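The conditional evaluated just above, not ansible_facts.keys() | list | intersect(network_test_required_facts) == network_test_required_facts, is what gates this task: the setup module only runs when the facts gathered so far do not already cover every fact the network tests need. A minimal Python sketch of that check follows, with placeholder values for both lists since neither is printed in full in this log:

    # Placeholder values -- the real ones come from task vars and earlier fact gathering.
    ansible_facts = {}                                # nothing has been gathered yet at this point in the run
    network_test_required_facts = ["distribution", "distribution_major_version"]

    def setup_needed(facts, required):
        # Jinja2's intersect filter keeps the names present in both lists; the task runs
        # whenever that intersection is not equal to the full required list.
        gathered = [name for name in facts if name in required]
        return not gathered == required

    print(setup_needed(ansible_facts, network_test_required_facts))   # True here, so setup is queued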
_low_level_execute_command(): starting 7557 1726882075.68815: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7557 1726882075.69328: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7557 1726882075.69337: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882075.69360: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 7557 1726882075.69363: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found <<< 7557 1726882075.69365: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882075.69409: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882075.69422: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882075.69483: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 7557 1726882075.71767: stdout chunk (state=3): >>>/root <<< 7557 1726882075.71944: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882075.71949: stdout chunk (state=3): >>><<< 7557 1726882075.71955: stderr chunk (state=3): >>><<< 7557 1726882075.71999: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 7557 1726882075.72013: _low_level_execute_command(): starting 7557 1726882075.72017: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882075.7197113-7662-281360842359470 `" && echo 
ansible-tmp-1726882075.7197113-7662-281360842359470="` echo /root/.ansible/tmp/ansible-tmp-1726882075.7197113-7662-281360842359470 `" ) && sleep 0' 7557 1726882075.72564: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7557 1726882075.72567: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found <<< 7557 1726882075.72570: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882075.72572: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882075.72574: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882075.72624: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882075.72628: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882075.72684: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 7557 1726882075.75330: stdout chunk (state=3): >>>ansible-tmp-1726882075.7197113-7662-281360842359470=/root/.ansible/tmp/ansible-tmp-1726882075.7197113-7662-281360842359470 <<< 7557 1726882075.75454: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882075.75481: stderr chunk (state=3): >>><<< 7557 1726882075.75484: stdout chunk (state=3): >>><<< 7557 1726882075.75499: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882075.7197113-7662-281360842359470=/root/.ansible/tmp/ansible-tmp-1726882075.7197113-7662-281360842359470 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 7557 1726882075.75536: variable 'ansible_module_compression' from source: unknown 7557 
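The two _low_level_execute_command() calls above are the standard preamble for running any module: 'echo ~ && sleep 0' resolves the remote user's home directory, and the umask/mkdir compound then creates a private per-task temp directory and echoes its name back so the controller knows where to upload the module. Both commands ride the SSH ControlMaster socket that is already open (the "auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41'" lines), so no new authentication round trip happens. A rough, illustrative Python sketch of the same exchange, not Ansible's own implementation, using the host and control path shown in the log:

    import subprocess

    host = "10.31.10.229"
    control_path = "/root/.ansible/cp/537759ca41"    # the existing master socket the log reuses
    remote_cmd = '( umask 77 && mkdir -p "$HOME/.ansible/tmp" ) && echo "$HOME/.ansible/tmp"'

    # Reusing the master connection makes this roughly as cheap as a local fork+exec.
    result = subprocess.run(
        ["ssh", "-o", f"ControlPath={control_path}", host, remote_cmd],
        capture_output=True, text=True,
    )
    print(result.returncode, result.stdout.strip())   # expect rc=0 and the temp directory path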
1726882075.75578: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-7557ap94rh2e/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 7557 1726882075.75638: variable 'ansible_facts' from source: unknown 7557 1726882075.75810: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882075.7197113-7662-281360842359470/AnsiballZ_setup.py 7557 1726882075.76018: Sending initial data 7557 1726882075.76022: Sent initial data (152 bytes) 7557 1726882075.76699: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7557 1726882075.76702: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7557 1726882075.76705: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882075.76709: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882075.76777: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882075.76781: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882075.76783: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882075.76856: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 7557 1726882075.79110: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 7557 1726882075.79157: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 7557 1726882075.79199: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-7557ap94rh2e/tmpbuzovtdp /root/.ansible/tmp/ansible-tmp-1726882075.7197113-7662-281360842359470/AnsiballZ_setup.py <<< 7557 1726882075.79203: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882075.7197113-7662-281360842359470/AnsiballZ_setup.py" <<< 7557 1726882075.79262: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-7557ap94rh2e/tmpbuzovtdp" to remote "/root/.ansible/tmp/ansible-tmp-1726882075.7197113-7662-281360842359470/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882075.7197113-7662-281360842359470/AnsiballZ_setup.py" <<< 7557 1726882075.81112: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882075.81198: stderr chunk (state=3): >>><<< 7557 1726882075.81202: stdout chunk (state=3): >>><<< 7557 1726882075.81204: done transferring module to remote 7557 1726882075.81206: _low_level_execute_command(): starting 7557 1726882075.81208: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882075.7197113-7662-281360842359470/ /root/.ansible/tmp/ansible-tmp-1726882075.7197113-7662-281360842359470/AnsiballZ_setup.py && sleep 0' 7557 1726882075.81603: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882075.81609: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found <<< 7557 1726882075.81627: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882075.81681: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882075.81684: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882075.81736: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 7557 1726882075.84239: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882075.84264: stderr chunk (state=3): >>><<< 7557 1726882075.84266: stdout chunk (state=3): >>><<< 7557 1726882075.84281: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 7557 1726882075.84297: _low_level_execute_command(): starting 7557 1726882075.84299: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882075.7197113-7662-281360842359470/AnsiballZ_setup.py && sleep 0' 7557 1726882075.84718: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882075.84721: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found <<< 7557 1726882075.84723: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882075.84725: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882075.84727: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882075.84777: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882075.84783: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882075.84843: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 7557 1726882075.88012: stdout chunk (state=3): >>>import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # <<< 7557 1726882075.88041: stdout chunk (state=3): >>> import '_weakref' # <<< 7557 1726882075.88124: stdout chunk (state=3): >>>import '_io' # <<< 7557 1726882075.88128: stdout chunk (state=3): >>> <<< 7557 1726882075.88192: stdout chunk (state=3): >>>import 'marshal' # import 'posix' # <<< 7557 1726882075.88198: stdout chunk (state=3): >>> <<< 7557 1726882075.88265: stdout chunk (state=3): >>>import '_frozen_importlib_external' # <<< 7557 1726882075.88268: stdout chunk (state=3): >>># installing zipimport hook<<< 7557 1726882075.88284: stdout chunk (state=3): >>> <<< 7557 1726882075.88307: stdout chunk (state=3): >>>import 'time' # import 'zipimport' # <<< 7557 1726882075.88359: stdout chunk (state=3): >>> # installed zipimport hook <<< 7557 1726882075.88411: stdout chunk (state=3): >>># 
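With the chmod done, the control side's work for this task is essentially finished: AnsiballZ_setup.py was uploaded over sftp, marked executable together with its temp directory, and is now being executed remotely with PYTHONVERBOSE=1 set. The wall of "import ..." and "# code object from ..." stdout chunks that follows is therefore not Ansible chatter; it is the remote Python 3.12 interpreter tracing every module it imports while the setup module starts up, a side effect of PYTHONVERBOSE=1. The same tracing can be reproduced locally, for example:

    import subprocess

    # -v is the command-line equivalent of PYTHONVERBOSE=1: the interpreter reports each import as it happens.
    subprocess.run(["python3", "-v", "-c", "import json"])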
/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py<<< 7557 1726882075.88414: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc'<<< 7557 1726882075.88440: stdout chunk (state=3): >>> <<< 7557 1726882075.88496: stdout chunk (state=3): >>>import '_codecs' # import 'codecs' # <<< 7557 1726882075.88511: stdout chunk (state=3): >>> <<< 7557 1726882075.88546: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py<<< 7557 1726882075.88583: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' <<< 7557 1726882075.88651: stdout chunk (state=3): >>>import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7faf520184d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7faf51fe7b30><<< 7557 1726882075.88692: stdout chunk (state=3): >>> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py<<< 7557 1726882075.88721: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7faf5201aa50><<< 7557 1726882075.88779: stdout chunk (state=3): >>> import '_signal' # <<< 7557 1726882075.88787: stdout chunk (state=3): >>> <<< 7557 1726882075.88803: stdout chunk (state=3): >>>import '_abc' # <<< 7557 1726882075.88838: stdout chunk (state=3): >>>import 'abc' # import 'io' # <<< 7557 1726882075.88845: stdout chunk (state=3): >>> <<< 7557 1726882075.88883: stdout chunk (state=3): >>>import '_stat' # <<< 7557 1726882075.89016: stdout chunk (state=3): >>> import 'stat' # import '_collections_abc' # <<< 7557 1726882075.89021: stdout chunk (state=3): >>> <<< 7557 1726882075.89070: stdout chunk (state=3): >>>import 'genericpath' # <<< 7557 1726882075.89073: stdout chunk (state=3): >>> <<< 7557 1726882075.89098: stdout chunk (state=3): >>>import 'posixpath' # <<< 7557 1726882075.89144: stdout chunk (state=3): >>>import 'os' # <<< 7557 1726882075.89150: stdout chunk (state=3): >>> <<< 7557 1726882075.89179: stdout chunk (state=3): >>>import '_sitebuiltins' # <<< 7557 1726882075.89210: stdout chunk (state=3): >>>Processing user site-packages <<< 7557 1726882075.89233: stdout chunk (state=3): >>>Processing global site-packages <<< 7557 1726882075.89320: stdout chunk (state=3): >>>Adding directory: '/usr/lib64/python3.12/site-packages'<<< 7557 1726882075.89335: stdout chunk (state=3): >>> Adding directory: '/usr/lib/python3.12/site-packages'<<< 7557 1726882075.89455: stdout chunk (state=3): >>> Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7faf51de9130> <<< 7557 1726882075.89502: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py <<< 7557 1726882075.89529: stdout chunk (state=3): >>># code object from 
'/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' <<< 7557 1726882075.89547: stdout chunk (state=3): >>>import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7faf51de9fa0> <<< 7557 1726882075.89594: stdout chunk (state=3): >>>import 'site' # <<< 7557 1726882075.89636: stdout chunk (state=3): >>>Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. <<< 7557 1726882075.90244: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py <<< 7557 1726882075.90272: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' <<< 7557 1726882075.90303: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py <<< 7557 1726882075.90323: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' <<< 7557 1726882075.90554: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7faf51e27e60> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7faf51e27f20> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py <<< 7557 1726882075.90578: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' <<< 7557 1726882075.90612: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py <<< 7557 1726882075.90681: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' <<< 7557 1726882075.90712: stdout chunk (state=3): >>>import 'itertools' # <<< 7557 1726882075.90745: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py <<< 7557 1726882075.90765: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7faf51e5f890> <<< 7557 1726882075.90853: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py <<< 7557 1726882075.90856: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7faf51e5ff20> import '_collections' # <<< 7557 1726882075.90900: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7faf51e3fb30> <<< 7557 1726882075.90903: stdout chunk (state=3): >>>import '_functools' # <<< 7557 
1726882075.90928: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7faf51e3d250> <<< 7557 1726882075.91076: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7faf51e25010> <<< 7557 1726882075.91088: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # <<< 7557 1726882075.91128: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py <<< 7557 1726882075.91147: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' <<< 7557 1726882075.91150: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' <<< 7557 1726882075.91187: stdout chunk (state=3): >>>import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7faf51e7f800> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7faf51e7e450> <<< 7557 1726882075.91228: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' <<< 7557 1726882075.91239: stdout chunk (state=3): >>>import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7faf51e3e120> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7faf51e7ccb0> <<< 7557 1726882075.91305: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7faf51eb4860> <<< 7557 1726882075.91350: stdout chunk (state=3): >>>import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7faf51e24290> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' <<< 7557 1726882075.91354: stdout chunk (state=3): >>># extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faf51eb4d10> <<< 7557 1726882075.91388: stdout chunk (state=3): >>>import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7faf51eb4bc0> <<< 7557 1726882075.91401: stdout chunk (state=3): >>># extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faf51eb4fb0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7faf51e22db0> <<< 7557 1726882075.91441: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py <<< 7557 1726882075.91453: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py <<< 7557 1726882075.91474: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' <<< 7557 1726882075.91506: stdout chunk (state=3): >>>import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7faf51eb56a0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7faf51eb5370> import 'importlib.machinery' # <<< 7557 1726882075.91536: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' <<< 7557 1726882075.91568: stdout chunk (state=3): >>>import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7faf51eb65a0> import 'importlib.util' # import 'runpy' # <<< 7557 1726882075.91599: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py <<< 7557 1726882075.91632: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' <<< 7557 1726882075.91655: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py <<< 7557 1726882075.91688: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7faf51ecc7a0> import 'errno' # <<< 7557 1726882075.91719: stdout chunk (state=3): >>># extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' <<< 7557 1726882075.91749: stdout chunk (state=3): >>># extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faf51ecde80> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' <<< 7557 1726882075.91773: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' <<< 7557 1726882075.91814: stdout chunk (state=3): >>>import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7faf51eced20> <<< 7557 1726882075.91846: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faf51ecf320> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7faf51ece270> <<< 7557 1726882075.91860: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from 
'/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' <<< 7557 1726882075.91903: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faf51ecfda0> <<< 7557 1726882075.91922: stdout chunk (state=3): >>>import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7faf51ecf4d0> <<< 7557 1726882075.91946: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7faf51eb6510> <<< 7557 1726882075.91954: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py <<< 7557 1726882075.91985: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' <<< 7557 1726882075.92006: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py<<< 7557 1726882075.92011: stdout chunk (state=3): >>> <<< 7557 1726882075.92260: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faf51bc3bf0> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faf51bec740> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7faf51bec4a0> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faf51bec680> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 7557 1726882075.92553: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faf51becfe0> <<< 7557 1726882075.92592: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' <<< 7557 1726882075.92611: stdout chunk (state=3): >>># extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7faf51bed910> <<< 7557 1726882075.92620: stdout chunk (state=3): >>>import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7faf51bec8c0> <<< 7557 1726882075.92654: stdout chunk (state=3): >>>import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7faf51bc1d90> <<< 7557 1726882075.92685: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py <<< 7557 1726882075.92711: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' <<< 7557 1726882075.92741: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py <<< 7557 1726882075.92761: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' <<< 7557 1726882075.92784: stdout chunk (state=3): >>>import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7faf51beed20> <<< 7557 1726882075.92814: stdout chunk (state=3): >>>import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7faf51beda60> <<< 7557 1726882075.92845: stdout chunk (state=3): >>>import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7faf51eb6750> <<< 7557 1726882075.92908: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py <<< 7557 1726882075.92965: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' <<< 7557 1726882075.92999: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py <<< 7557 1726882075.93041: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' <<< 7557 1726882075.93085: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7faf51c17080> <<< 7557 1726882075.93159: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py <<< 7557 1726882075.93179: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' <<< 7557 1726882075.93251: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' <<< 7557 1726882075.93296: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7faf51c3b440> <<< 7557 1726882075.93326: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py <<< 7557 1726882075.93392: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' <<< 7557 1726882075.93659: stdout chunk (state=3): >>>import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7faf51c9c230> # 
/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' <<< 7557 1726882075.93755: stdout chunk (state=3): >>>import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7faf51c9e990> <<< 7557 1726882075.93869: stdout chunk (state=3): >>>import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7faf51c9c350> <<< 7557 1726882075.93921: stdout chunk (state=3): >>>import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7faf51c69250> <<< 7557 1726882075.93959: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py <<< 7557 1726882075.93971: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' <<< 7557 1726882075.93977: stdout chunk (state=3): >>>import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7faf51525310> <<< 7557 1726882075.94011: stdout chunk (state=3): >>>import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7faf51c3a240> <<< 7557 1726882075.94017: stdout chunk (state=3): >>>import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7faf51befc50> <<< 7557 1726882075.94297: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc' <<< 7557 1726882075.94326: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7faf515255b0> <<< 7557 1726882075.94816: stdout chunk (state=3): >>># zipimport: found 103 names in '/tmp/ansible_setup_payload_9g3p2sq7/ansible_setup_payload.zip' <<< 7557 1726882075.94824: stdout chunk (state=3): >>># zipimport: zlib available <<< 7557 1726882075.94971: stdout chunk (state=3): >>># zipimport: zlib available <<< 7557 1726882075.95004: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' <<< 7557 1726882075.95048: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py <<< 7557 1726882075.95174: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' <<< 7557 1726882075.95178: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' <<< 7557 1726882075.95181: stdout chunk (state=3): >>>import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7faf5158efc0> import '_typing' # <<< 7557 1726882075.95347: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7faf5156deb0> <<< 7557 1726882075.95361: stdout chunk (state=3): >>>import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7faf5156d0a0> # zipimport: zlib available <<< 7557 1726882075.95403: stdout chunk (state=3): >>>import 'ansible' # # zipimport: zlib available <<< 7557 
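The "# zipimport: found 103 names in '/tmp/ansible_setup_payload_9g3p2sq7/ansible_setup_payload.zip'" line just above shows the remote half of AnsiballZ: the setup module and the ansible.module_utils packages it needs ship inside a zip payload, and they are imported straight out of that archive through Python's zipimport machinery (hence the "# zipimport: zlib available" markers in front of each ansible.* import) rather than being unpacked into individual files. A small self-contained sketch of the same mechanism, using a throwaway archive and package name instead of the real payload:

    import sys, zipfile, zipimport

    # Build a tiny stand-in payload (the real one is the ansible_setup_payload.zip named in the log).
    payload = "/tmp/demo_payload.zip"
    with zipfile.ZipFile(payload, "w") as zf:
        zf.writestr("demo_pkg/__init__.py", "GREETING = 'imported straight from the zip'\n")

    zipimport.zipimporter(payload)    # indexes the archive, as in the "found 103 names" log line
    sys.path.insert(0, payload)       # a zip on sys.path is importable like a directory
    import demo_pkg
    print(demo_pkg.GREETING)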
1726882075.95408: stdout chunk (state=3): >>># zipimport: zlib available <<< 7557 1726882075.95451: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils' # <<< 7557 1726882075.95454: stdout chunk (state=3): >>># zipimport: zlib available <<< 7557 1726882075.97438: stdout chunk (state=3): >>># zipimport: zlib available <<< 7557 1726882075.98888: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7faf5158d2b0> <<< 7557 1726882075.98940: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' <<< 7557 1726882075.98956: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' <<< 7557 1726882075.98978: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' <<< 7557 1726882075.99012: stdout chunk (state=3): >>># extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' <<< 7557 1726882075.99037: stdout chunk (state=3): >>># extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faf515be990> <<< 7557 1726882075.99058: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7faf515be720> <<< 7557 1726882075.99084: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7faf515be030> <<< 7557 1726882075.99111: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' <<< 7557 1726882075.99157: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7faf515be480> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7faf5158fc50> <<< 7557 1726882075.99189: stdout chunk (state=3): >>>import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faf515bf710> <<< 7557 1726882075.99228: stdout chunk (state=3): >>># extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faf515bf920> <<< 7557 1726882075.99247: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py 
<<< 7557 1726882075.99300: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # <<< 7557 1726882075.99386: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7faf515bfe60> <<< 7557 1726882075.99437: stdout chunk (state=3): >>>import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7faf51429c70> <<< 7557 1726882075.99466: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faf5142b890> <<< 7557 1726882075.99555: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py <<< 7557 1726882075.99558: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7faf5142c230> <<< 7557 1726882075.99651: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7faf5142d3a0> <<< 7557 1726882075.99717: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' <<< 7557 1726882075.99760: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7faf5142fe00> <<< 7557 1726882075.99825: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faf51e22ea0> <<< 7557 1726882075.99866: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7faf5142e0c0> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' <<< 7557 1726882075.99883: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' <<< 7557 1726882075.99916: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py <<< 7557 1726882076.00008: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' <<< 7557 1726882076.00056: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' <<< 7557 1726882076.00076: stdout chunk (state=3): >>>import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7faf51437e30> import '_tokenize' # <<< 7557 1726882076.00148: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7faf51436900> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7faf51436660> <<< 7557 1726882076.00359: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7faf51436bd0> <<< 7557 1726882076.00362: stdout chunk (state=3): >>>import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7faf5142e5d0> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faf5147bf20> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7faf5147c1d0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' <<< 7557 1726882076.00408: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' <<< 7557 1726882076.00411: stdout chunk (state=3): >>># extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faf5147dc70> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7faf5147da30> <<< 7557 1726882076.00428: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py <<< 7557 1726882076.00465: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' <<< 7557 1726882076.00515: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' <<< 7557 1726882076.00548: stdout chunk (state=3): >>># extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faf51480230> import 'uuid' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7faf5147e360> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py <<< 7557 1726882076.00611: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py <<< 7557 1726882076.00622: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # <<< 7557 1726882076.00685: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7faf514839e0> <<< 7557 1726882076.00907: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7faf514803e0> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' <<< 7557 1726882076.00916: stdout chunk (state=3): >>># extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faf51484ad0> <<< 7557 1726882076.00959: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faf51484bc0> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faf51484c50> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7faf5147c320> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py <<< 7557 1726882076.01200: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' <<< 7557 1726882076.01247: stdout chunk (state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faf5130c2f0> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faf5130d100> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7faf51486ae0> # extension module 'systemd._daemon' loaded from 
'/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faf51487e60> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7faf514866f0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # <<< 7557 1726882076.01262: stdout chunk (state=3): >>># zipimport: zlib available <<< 7557 1726882076.01347: stdout chunk (state=3): >>># zipimport: zlib available <<< 7557 1726882076.01438: stdout chunk (state=3): >>># zipimport: zlib available <<< 7557 1726882076.01478: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available <<< 7557 1726882076.01499: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text' # # zipimport: zlib available <<< 7557 1726882076.01604: stdout chunk (state=3): >>># zipimport: zlib available <<< 7557 1726882076.01718: stdout chunk (state=3): >>># zipimport: zlib available <<< 7557 1726882076.02256: stdout chunk (state=3): >>># zipimport: zlib available <<< 7557 1726882076.02797: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # <<< 7557 1726882076.02830: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py <<< 7557 1726882076.02846: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' <<< 7557 1726882076.02894: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faf51315370> <<< 7557 1726882076.02969: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' <<< 7557 1726882076.03004: stdout chunk (state=3): >>>import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7faf51316210> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7faf5130ff20> <<< 7557 1726882076.03037: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.selinux' # <<< 7557 1726882076.03064: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 7557 1726882076.03091: stdout chunk (state=3): >>>import 'ansible.module_utils._text' # <<< 7557 1726882076.03111: stdout chunk (state=3): >>># zipimport: zlib available <<< 7557 1726882076.03242: stdout chunk (state=3): >>># zipimport: zlib available <<< 7557 1726882076.03408: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7faf51316900> <<< 7557 1726882076.03411: 
stdout chunk (state=3): >>># zipimport: zlib available <<< 7557 1726882076.03907: stdout chunk (state=3): >>># zipimport: zlib available <<< 7557 1726882076.04297: stdout chunk (state=3): >>># zipimport: zlib available <<< 7557 1726882076.04396: stdout chunk (state=3): >>># zipimport: zlib available <<< 7557 1726882076.04431: stdout chunk (state=3): >>>import 'ansible.module_utils.common.collections' # # zipimport: zlib available <<< 7557 1726882076.04596: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available <<< 7557 1726882076.04619: stdout chunk (state=3): >>># zipimport: zlib available <<< 7557 1726882076.04668: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # <<< 7557 1726882076.04702: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # <<< 7557 1726882076.04736: stdout chunk (state=3): >>># zipimport: zlib available <<< 7557 1726882076.04783: stdout chunk (state=3): >>># zipimport: zlib available <<< 7557 1726882076.04837: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available <<< 7557 1726882076.05030: stdout chunk (state=3): >>># zipimport: zlib available <<< 7557 1726882076.05237: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py <<< 7557 1726882076.05298: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' <<< 7557 1726882076.05310: stdout chunk (state=3): >>>import '_ast' # <<< 7557 1726882076.05379: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7faf513174a0> # zipimport: zlib available <<< 7557 1726882076.05445: stdout chunk (state=3): >>># zipimport: zlib available <<< 7557 1726882076.05528: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # <<< 7557 1726882076.05547: stdout chunk (state=3): >>>import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # <<< 7557 1726882076.05556: stdout chunk (state=3): >>># zipimport: zlib available <<< 7557 1726882076.05602: stdout chunk (state=3): >>># zipimport: zlib available <<< 7557 1726882076.05635: stdout chunk (state=3): >>>import 'ansible.module_utils.common.locale' # <<< 7557 1726882076.05645: stdout chunk (state=3): >>># zipimport: zlib available <<< 7557 1726882076.05685: stdout chunk (state=3): >>># zipimport: zlib available <<< 7557 1726882076.05726: stdout chunk (state=3): >>># zipimport: zlib available <<< 7557 1726882076.05780: stdout chunk (state=3): >>># zipimport: zlib available <<< 7557 1726882076.05846: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py <<< 7557 1726882076.05882: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' <<< 7557 1726882076.05962: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7faf51321f10> <<< 7557 1726882076.06021: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7faf5131d700> <<< 7557 1726882076.06043: stdout chunk (state=3): >>>import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available <<< 7557 1726882076.06134: stdout chunk (state=3): >>># zipimport: zlib available <<< 7557 1726882076.06267: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 7557 1726882076.06274: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' <<< 7557 1726882076.06428: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' <<< 7557 1726882076.06431: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py <<< 7557 1726882076.06465: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7faf5140a8d0> <<< 7557 1726882076.06476: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7faf515ea5a0> <<< 7557 1726882076.06711: stdout chunk (state=3): >>>import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7faf51322030> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7faf51486b10> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available <<< 7557 1726882076.06743: stdout chunk (state=3): >>>import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available <<< 7557 1726882076.06809: stdout chunk (state=3): >>># zipimport: zlib available <<< 7557 1726882076.06886: stdout chunk (state=3): >>># zipimport: zlib available <<< 7557 1726882076.06955: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 7557 1726882076.07005: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 7557 1726882076.07066: stdout chunk (state=3): >>># zipimport: zlib available <<< 7557 1726882076.07349: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available <<< 7557 1726882076.07353: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.typing' # # zipimport: zlib available <<< 7557 1726882076.07412: stdout chunk (state=3): >>># zipimport: zlib available <<< 7557 1726882076.07592: stdout chunk (state=3): >>># zipimport: zlib available <<< 7557 1726882076.07621: stdout chunk 
(state=3): >>># zipimport: zlib available <<< 7557 1726882076.07687: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' <<< 7557 1726882076.07718: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' <<< 7557 1726882076.07734: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py <<< 7557 1726882076.07768: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' <<< 7557 1726882076.07806: stdout chunk (state=3): >>>import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7faf513b1e80> <<< 7557 1726882076.07830: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py <<< 7557 1726882076.07871: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' <<< 7557 1726882076.08021: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7faf50febfe0> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faf50ff0320> <<< 7557 1726882076.08047: stdout chunk (state=3): >>>import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7faf513987a0> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7faf513b29c0> <<< 7557 1726882076.08156: stdout chunk (state=3): >>>import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7faf513b05f0> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7faf513b01a0> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' <<< 7557 1726882076.08186: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' <<< 7557 1726882076.08230: stdout chunk (state=3): >>># extension module '_heapq' loaded from 
'/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faf50ff3260> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7faf50ff2b10> <<< 7557 1726882076.08264: stdout chunk (state=3): >>># extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faf50ff2cc0> <<< 7557 1726882076.08295: stdout chunk (state=3): >>>import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7faf50ff1f40> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py <<< 7557 1726882076.08445: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7faf50ff33e0> # /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py <<< 7557 1726882076.08514: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' <<< 7557 1726882076.08529: stdout chunk (state=3): >>># extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faf5104dee0> <<< 7557 1726882076.08592: stdout chunk (state=3): >>>import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7faf50ff3ec0> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7faf513b0200> import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # <<< 7557 1726882076.08650: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other' # # zipimport: zlib available <<< 7557 1726882076.08675: stdout chunk (state=3): >>># zipimport: zlib available <<< 7557 1726882076.08742: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.facter' # <<< 7557 1726882076.08745: stdout chunk (state=3): >>># zipimport: zlib available <<< 7557 1726882076.08921: stdout chunk (state=3): >>># zipimport: zlib available <<< 7557 1726882076.08925: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.ohai' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available # zipimport: zlib available <<< 7557 1726882076.08951: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available <<< 7557 1726882076.09016: stdout chunk (state=3): >>># zipimport: zlib available <<< 7557 1726882076.09044: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.caps' # <<< 7557 1726882076.09064: stdout chunk (state=3): >>># zipimport: zlib available <<< 7557 
1726882076.09102: stdout chunk (state=3): >>># zipimport: zlib available <<< 7557 1726882076.09260: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available <<< 7557 1726882076.09263: stdout chunk (state=3): >>># zipimport: zlib available <<< 7557 1726882076.09265: stdout chunk (state=3): >>># zipimport: zlib available <<< 7557 1726882076.09315: stdout chunk (state=3): >>># zipimport: zlib available <<< 7557 1726882076.09412: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.utils' # <<< 7557 1726882076.09415: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available <<< 7557 1726882076.09865: stdout chunk (state=3): >>># zipimport: zlib available <<< 7557 1726882076.10305: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib available <<< 7557 1726882076.10767: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.dns' # <<< 7557 1726882076.10774: stdout chunk (state=3): >>># zipimport: zlib available <<< 7557 1726882076.10776: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available # zipimport: zlib available <<< 7557 1726882076.10877: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available # zipimport: zlib available <<< 7557 1726882076.10963: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' <<< 7557 1726882076.11005: stdout chunk (state=3): >>>import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7faf5104f770> # /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py <<< 7557 1726882076.11097: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' <<< 7557 1726882076.11148: stdout chunk (state=3): >>>import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7faf5104e960> <<< 7557 1726882076.11168: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available <<< 7557 1726882076.11226: stdout chunk (state=3): >>># zipimport: zlib available <<< 7557 1726882076.11299: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.lsb' # # zipimport: zlib available <<< 7557 1726882076.11449: stdout chunk (state=3): >>># zipimport: zlib available <<< 7557 1726882076.11472: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.pkg_mgr' # <<< 7557 1726882076.11499: stdout chunk (state=3): >>># zipimport: zlib available <<< 7557 1726882076.11626: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available <<< 7557 1726882076.11669: stdout chunk (state=3): >>># zipimport: zlib available <<< 7557 1726882076.11709: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches 
/usr/lib64/python3.12/ssl.py <<< 7557 1726882076.11874: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' <<< 7557 1726882076.11956: stdout chunk (state=3): >>># extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faf5108e0f0> <<< 7557 1726882076.12099: stdout chunk (state=3): >>>import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7faf5107dee0> import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available <<< 7557 1726882076.12206: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available <<< 7557 1726882076.12277: stdout chunk (state=3): >>># zipimport: zlib available <<< 7557 1726882076.12543: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 7557 1726882076.12614: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available <<< 7557 1726882076.12662: stdout chunk (state=3): >>># zipimport: zlib available <<< 7557 1726882076.12700: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.ssh_pub_keys' # <<< 7557 1726882076.12714: stdout chunk (state=3): >>># zipimport: zlib available <<< 7557 1726882076.12748: stdout chunk (state=3): >>># zipimport: zlib available <<< 7557 1726882076.12789: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py <<< 7557 1726882076.12998: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faf510a1cd0> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7faf5107f140> import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # <<< 7557 1726882076.13206: stdout chunk (state=3): >>># zipimport: zlib available <<< 7557 1726882076.13213: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available <<< 7557 1726882076.13234: stdout chunk (state=3): >>># zipimport: zlib available <<< 7557 1726882076.13277: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.aix' # <<< 7557 1726882076.13304: stdout chunk (state=3): >>># zipimport: zlib available <<< 7557 1726882076.13498: stdout chunk (state=3): >>># zipimport: zlib available <<< 7557 1726882076.13501: stdout chunk (state=3): >>># zipimport: zlib available <<< 7557 1726882076.13529: stdout chunk (state=3): >>># zipimport: zlib available <<< 7557 1726882076.13570: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.sysctl' # <<< 7557 1726882076.13637: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib available <<< 7557 1726882076.13752: stdout chunk (state=3): >>># 
zipimport: zlib available # zipimport: zlib available <<< 7557 1726882076.13769: stdout chunk (state=3): >>># zipimport: zlib available <<< 7557 1726882076.13907: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # <<< 7557 1726882076.13924: stdout chunk (state=3): >>># zipimport: zlib available <<< 7557 1726882076.14031: stdout chunk (state=3): >>># zipimport: zlib available <<< 7557 1726882076.14152: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.hpux' # # zipimport: zlib available <<< 7557 1726882076.14284: stdout chunk (state=3): >>># zipimport: zlib available <<< 7557 1726882076.14301: stdout chunk (state=3): >>># zipimport: zlib available <<< 7557 1726882076.14771: stdout chunk (state=3): >>># zipimport: zlib available <<< 7557 1726882076.15281: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.linux' # <<< 7557 1726882076.15289: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available <<< 7557 1726882076.15398: stdout chunk (state=3): >>># zipimport: zlib available <<< 7557 1726882076.15510: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.netbsd' # # zipimport: zlib available <<< 7557 1726882076.15586: stdout chunk (state=3): >>># zipimport: zlib available <<< 7557 1726882076.15689: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.openbsd' # # zipimport: zlib available <<< 7557 1726882076.15842: stdout chunk (state=3): >>># zipimport: zlib available <<< 7557 1726882076.16273: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available <<< 7557 1726882076.16298: stdout chunk (state=3): >>># zipimport: zlib available <<< 7557 1726882076.16441: stdout chunk (state=3): >>># zipimport: zlib available <<< 7557 1726882076.16759: stdout chunk (state=3): >>># zipimport: zlib available <<< 7557 1726882076.17076: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.generic_bsd' # <<< 7557 1726882076.17095: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.aix' # # zipimport: zlib available <<< 7557 1726882076.17184: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.darwin' # <<< 7557 1726882076.17205: stdout chunk (state=3): >>># zipimport: zlib available <<< 7557 1726882076.17227: stdout chunk (state=3): >>># zipimport: zlib available <<< 7557 1726882076.17273: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available <<< 7557 1726882076.17454: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.fc_wwn' # <<< 7557 1726882076.17512: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available <<< 7557 1726882076.17617: stdout chunk (state=3): >>># zipimport: zlib available <<< 7557 1726882076.17691: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hpux' # <<< 7557 1726882076.17697: stdout chunk (state=3): >>># zipimport: zlib available <<< 7557 1726882076.17769: stdout chunk (state=3): >>># zipimport: zlib available 
<<< 7557 1726882076.17847: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hurd' # <<< 7557 1726882076.17858: stdout chunk (state=3): >>># zipimport: zlib available <<< 7557 1726882076.18335: stdout chunk (state=3): >>># zipimport: zlib available <<< 7557 1726882076.18633: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.linux' # <<< 7557 1726882076.18650: stdout chunk (state=3): >>># zipimport: zlib available <<< 7557 1726882076.18710: stdout chunk (state=3): >>># zipimport: zlib available <<< 7557 1726882076.18788: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.iscsi' # <<< 7557 1726882076.18815: stdout chunk (state=3): >>># zipimport: zlib available <<< 7557 1726882076.19012: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available # zipimport: zlib available <<< 7557 1726882076.19101: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.openbsd' # <<< 7557 1726882076.19178: stdout chunk (state=3): >>># zipimport: zlib available <<< 7557 1726882076.19180: stdout chunk (state=3): >>># zipimport: zlib available <<< 7557 1726882076.19228: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.sunos' # # zipimport: zlib available <<< 7557 1726882076.19263: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available <<< 7557 1726882076.19312: stdout chunk (state=3): >>># zipimport: zlib available <<< 7557 1726882076.19357: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.base' # <<< 7557 1726882076.19413: stdout chunk (state=3): >>># zipimport: zlib available <<< 7557 1726882076.19416: stdout chunk (state=3): >>># zipimport: zlib available <<< 7557 1726882076.19436: stdout chunk (state=3): >>># zipimport: zlib available <<< 7557 1726882076.19483: stdout chunk (state=3): >>># zipimport: zlib available <<< 7557 1726882076.19524: stdout chunk (state=3): >>># zipimport: zlib available <<< 7557 1726882076.19619: stdout chunk (state=3): >>># zipimport: zlib available <<< 7557 1726882076.19672: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sysctl' # <<< 7557 1726882076.19694: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available <<< 7557 1726882076.19734: stdout chunk (state=3): >>># zipimport: zlib available <<< 7557 1726882076.19799: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.hpux' # <<< 7557 1726882076.19803: stdout chunk (state=3): >>># zipimport: zlib available <<< 7557 1726882076.19976: stdout chunk (state=3): >>># zipimport: zlib available <<< 7557 1726882076.20204: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.linux' # # zipimport: zlib available <<< 7557 1726882076.20243: stdout chunk (state=3): >>># zipimport: zlib available <<< 7557 1726882076.20301: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available <<< 7557 1726882076.20522: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available # zipimport: zlib available <<< 7557 1726882076.20634: stdout chunk (state=3): >>>import 
'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # # zipimport: zlib available <<< 7557 1726882076.20781: stdout chunk (state=3): >>># zipimport: zlib available <<< 7557 1726882076.20897: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # <<< 7557 1726882076.21004: stdout chunk (state=3): >>># zipimport: zlib available <<< 7557 1726882076.21811: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py<<< 7557 1726882076.21820: stdout chunk (state=3): >>> <<< 7557 1726882076.21841: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc'<<< 7557 1726882076.21877: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py <<< 7557 1726882076.21911: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc'<<< 7557 1726882076.21965: stdout chunk (state=3): >>> # extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so'<<< 7557 1726882076.21979: stdout chunk (state=3): >>> <<< 7557 1726882076.21982: stdout chunk (state=3): >>># extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so'<<< 7557 1726882076.21988: stdout chunk (state=3): >>> <<< 7557 1726882076.22023: stdout chunk (state=3): >>>import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faf50e9f560> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7faf50e9eba0><<< 7557 1726882076.22029: stdout chunk (state=3): >>> <<< 7557 1726882076.22196: stdout chunk (state=3): >>>import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7faf50e9f650> <<< 7557 1726882076.23116: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_apparmor": {"status": "disabled"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_ssh_host_key_rsa_public": 
"AAAAB3NzaC1yc2EAAAADAQABAAABgQCv7uM8iExTeI4wsxGEirDCIB5rfuashDyqixAMrsgojV44m9e49NO3hj7ILsTTBL2CHnfLuLE1/PLpq7UY8Z1Z8ro+SmmXu++VXRqryH5co2uqHva7V6sHb6D0w7V9QhBLpdZFYEoP0DS5gVD9JQFynOilgl8wt/jWccIG1lWZi9pozQdP7A/myzjixT/sJ/dwyz8xvTWJg8mm1MsbYn2WTH8iil55RGt5+Srq66y14fY2WfYG2fpZAu2FUQP08MxFIAzAetJatr6cWpPKpSpFt3GxBUw9mZMYCqrmgqwBD/PAtXD6Q7x/7qAtiiHsfMBTZienaA1mW1aNHB5lYinW+yIEPJsEXOfVQXD7Grje437Hq7ilY2Ls8shFo/H1kZ7MVesrrJ0x/2SBU9GvKJMaweWKcsmmll+jNBUuGX6ts04Vmsca92EMTJvbEZ5S0c4wSIE0d0Abf1Xqh6e9aP6EWDz6EY13coJ8t20q68K2L8C+7SV2ymAL1nKR36KDmUU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBK8+EpkEsEK0/7/tF+Ot2JevPtJYRlnBvekg0Ue9FRv3lrN7bw8W95KfTN9YYbHxSXwfmPM7CC79pp6v7bDk8dE=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIFW1A+ae3pfP8rgVu0EA2QvBQu2xPGiaOdV7VpH2SdJ3", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_local": {}, "ansible_lsb": {}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_fips": false, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "27", "second": "56", "epoch": "1726882076", "epoch_int": "1726882076", "date": "2024-09-20", "time": "21:27:56", "iso8601_micro": "2024-09-21T01:27:56.216458Z", "iso8601": "2024-09-21T01:27:56Z", "iso8601_basic": "20240920T212756216458", "iso8601_basic_short": "20240920T212756", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-10-229.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-10-229", "ansible_nodename": "ip-10-31-10-229.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec23ea4468ccc875d6f6db60ff64318a", "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/1", "PWD": "/root", "LOGNAME": "root", "XDG<<< 7557 1726882076.23121: stdout chunk (state=3): >>>_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.11.248 53716 10.31.10.229 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.11.248 53716 22", 
"DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/1"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "gather_subset": ["min"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["min"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 7557 1726882076.23883: stdout chunk (state=3): >>># clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ <<< 7557 1726882076.23900: stdout chunk (state=3): >>># clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # 
cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible <<< 7557 1726882076.23911: stdout chunk (state=3): >>># destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections<<< 7557 1726882076.23915: stdout chunk (state=3): >>> # cleanup[2] removing 
ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast <<< 7557 1726882076.23918: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file <<< 7557 1726882076.24011: stdout chunk (state=3): >>># destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy 
ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob <<< 7557 1726882076.24015: stdout chunk (state=3): >>># cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix <<< 7557 1726882076.24022: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] 
removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system <<< 7557 1726882076.24078: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy 
ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna <<< 7557 1726882076.24581: stdout chunk (state=3): >>># destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress <<< 7557 1726882076.24607: stdout chunk (state=3): >>># destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings <<< 7557 1726882076.24664: stdout chunk (state=3): >>># destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selinux # destroy shutil <<< 7557 1726882076.24681: stdout chunk (state=3): >>># destroy distro # destroy distro.distro # destroy argparse # destroy logging <<< 7557 1726882076.24721: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector <<< 7557 1726882076.24760: stdout chunk (state=3): >>># destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool # destroy signal <<< 7557 1726882076.24780: stdout chunk (state=3): >>># destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue <<< 7557 1726882076.24798: stdout chunk (state=3): >>># destroy multiprocessing.process # destroy unicodedata # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy selectors # destroy _multiprocessing <<< 7557 1726882076.24811: stdout chunk (state=3): >>># destroy shlex # destroy fcntl # destroy datetime <<< 7557 1726882076.24837: stdout chunk (state=3): >>># destroy subprocess # destroy base64 # destroy _ssl <<< 7557 1726882076.24858: stdout chunk (state=3): >>># destroy ansible.module_utils.compat.selinux <<< 7557 1726882076.24864: stdout chunk (state=3): >>># destroy getpass # destroy pwd # destroy termios <<< 7557 1726882076.24902: stdout chunk (state=3): >>># destroy errno # destroy json # destroy socket # destroy struct <<< 7557 1726882076.24913: stdout chunk (state=3): >>># destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector <<< 7557 1726882076.25004: stdout chunk (state=3): >>># cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux <<< 7557 1726882076.25016: stdout chunk (state=3): >>># cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] 
wiping _socket # cleanup[3] wiping systemd.id128 <<< 7557 1726882076.25037: stdout chunk (state=3): >>># cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools <<< 7557 1726882076.25043: stdout chunk (state=3): >>># cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc <<< 7557 1726882076.25063: stdout chunk (state=3): >>># destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types <<< 7557 1726882076.25087: stdout chunk (state=3): >>># cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time <<< 7557 1726882076.25092: stdout chunk (state=3): >>># cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings <<< 7557 1726882076.25097: stdout chunk (state=3): >>># cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins <<< 7557 1726882076.25109: stdout chunk (state=3): >>># destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime <<< 7557 1726882076.25284: stdout chunk (state=3): >>># destroy sys.monitoring # destroy _socket <<< 7557 1726882076.25356: stdout chunk (state=3): >>># destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib <<< 7557 1726882076.25397: stdout chunk (state=3): >>># destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves <<< 7557 1726882076.25436: stdout chunk (state=3): >>># destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear 
sys.meta_path # clear sys.modules # destroy _frozen_importlib <<< 7557 1726882076.25549: stdout chunk (state=3): >>># destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs <<< 7557 1726882076.25586: stdout chunk (state=3): >>># destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random <<< 7557 1726882076.25607: stdout chunk (state=3): >>># destroy _weakref # destroy _hashlib <<< 7557 1726882076.25639: stdout chunk (state=3): >>># destroy _operator # destroy _sre # destroy _string # destroy re # destroy itertools <<< 7557 1726882076.25660: stdout chunk (state=3): >>># destroy _abc # destroy posix <<< 7557 1726882076.25711: stdout chunk (state=3): >>># destroy _functools # destroy builtins # destroy _thread <<< 7557 1726882076.25722: stdout chunk (state=3): >>># clear sys.audit hooks <<< 7557 1726882076.26223: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. <<< 7557 1726882076.26226: stdout chunk (state=3): >>><<< 7557 1726882076.26254: stderr chunk (state=3): >>><<< 7557 1726882076.26364: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7faf520184d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7faf51fe7b30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7faf5201aa50> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7faf51de9130> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches 
/usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7faf51de9fa0> import 'site' # Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. # /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7faf51e27e60> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7faf51e27f20> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7faf51e5f890> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7faf51e5ff20> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7faf51e3fb30> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7faf51e3d250> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7faf51e25010> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7faf51e7f800> import 're._parser' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7faf51e7e450> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7faf51e3e120> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7faf51e7ccb0> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7faf51eb4860> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7faf51e24290> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faf51eb4d10> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7faf51eb4bc0> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faf51eb4fb0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7faf51e22db0> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7faf51eb56a0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7faf51eb5370> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7faf51eb65a0> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7faf51ecc7a0> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faf51ecde80> 
# /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7faf51eced20> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faf51ecf320> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7faf51ece270> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faf51ecfda0> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7faf51ecf4d0> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7faf51eb6510> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faf51bc3bf0> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faf51bec740> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7faf51bec4a0> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faf51bec680> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import 
'_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faf51becfe0> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faf51bed910> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7faf51bec8c0> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7faf51bc1d90> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7faf51beed20> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7faf51beda60> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7faf51eb6750> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7faf51c17080> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7faf51c3b440> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7faf51c9c230> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7faf51c9e990> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7faf51c9c350> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7faf51c69250> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from 
'/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7faf51525310> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7faf51c3a240> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7faf51befc50> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7faf515255b0> # zipimport: found 103 names in '/tmp/ansible_setup_payload_9g3p2sq7/ansible_setup_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7faf5158efc0> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7faf5156deb0> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7faf5156d0a0> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7faf5158d2b0> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faf515be990> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7faf515be720> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7faf515be030> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7faf515be480> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7faf5158fc50> import 'atexit' # # extension module 'grp' 
loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faf515bf710> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faf515bf920> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7faf515bfe60> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7faf51429c70> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faf5142b890> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7faf5142c230> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7faf5142d3a0> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7faf5142fe00> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faf51e22ea0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7faf5142e0c0> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # 
/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7faf51437e30> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7faf51436900> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7faf51436660> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7faf51436bd0> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7faf5142e5d0> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faf5147bf20> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7faf5147c1d0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faf5147dc70> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7faf5147da30> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faf51480230> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7faf5147e360> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7faf514839e0> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7faf514803e0> # extension module 'systemd._journal' 
loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faf51484ad0> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faf51484bc0> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faf51484c50> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7faf5147c320> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faf5130c2f0> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faf5130d100> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7faf51486ae0> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faf51487e60> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7faf514866f0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches 
/usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faf51315370> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7faf51316210> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7faf5130ff20> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7faf51316900> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7faf513174a0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faf51321f10> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7faf5131d700> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib 
available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7faf5140a8d0> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7faf515ea5a0> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7faf51322030> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7faf51486b10> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.typing' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7faf513b1e80> # /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py # code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import 
'_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7faf50febfe0> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faf50ff0320> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7faf513987a0> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7faf513b29c0> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7faf513b05f0> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7faf513b01a0> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faf50ff3260> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7faf50ff2b10> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faf50ff2cc0> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7faf50ff1f40> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7faf50ff33e0> # /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faf5104dee0> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7faf50ff3ec0> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7faf513b0200> import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other' # # 
zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.facter' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.ohai' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.dns' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7faf5104f770> # /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py # code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7faf5104e960> import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.lsb' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.pkg_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py # code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faf5108e0f0> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7faf5107dee0> import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.version' # import 
'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faf510a1cd0> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7faf5107f140> import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.hpux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.darwin' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available # zipimport: zlib available import 
'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # # zipimport: zlib available # /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7faf50e9f560> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7faf50e9eba0> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7faf50e9f650> {"ansible_facts": {"ansible_apparmor": {"status": "disabled"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": 
"(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCv7uM8iExTeI4wsxGEirDCIB5rfuashDyqixAMrsgojV44m9e49NO3hj7ILsTTBL2CHnfLuLE1/PLpq7UY8Z1Z8ro+SmmXu++VXRqryH5co2uqHva7V6sHb6D0w7V9QhBLpdZFYEoP0DS5gVD9JQFynOilgl8wt/jWccIG1lWZi9pozQdP7A/myzjixT/sJ/dwyz8xvTWJg8mm1MsbYn2WTH8iil55RGt5+Srq66y14fY2WfYG2fpZAu2FUQP08MxFIAzAetJatr6cWpPKpSpFt3GxBUw9mZMYCqrmgqwBD/PAtXD6Q7x/7qAtiiHsfMBTZienaA1mW1aNHB5lYinW+yIEPJsEXOfVQXD7Grje437Hq7ilY2Ls8shFo/H1kZ7MVesrrJ0x/2SBU9GvKJMaweWKcsmmll+jNBUuGX6ts04Vmsca92EMTJvbEZ5S0c4wSIE0d0Abf1Xqh6e9aP6EWDz6EY13coJ8t20q68K2L8C+7SV2ymAL1nKR36KDmUU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBK8+EpkEsEK0/7/tF+Ot2JevPtJYRlnBvekg0Ue9FRv3lrN7bw8W95KfTN9YYbHxSXwfmPM7CC79pp6v7bDk8dE=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIFW1A+ae3pfP8rgVu0EA2QvBQu2xPGiaOdV7VpH2SdJ3", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_local": {}, "ansible_lsb": {}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_fips": false, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "27", "second": "56", "epoch": "1726882076", "epoch_int": "1726882076", "date": "2024-09-20", "time": "21:27:56", "iso8601_micro": "2024-09-21T01:27:56.216458Z", "iso8601": "2024-09-21T01:27:56Z", "iso8601_basic": "20240920T212756216458", "iso8601_basic_short": "20240920T212756", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-10-229.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-10-229", "ansible_nodename": "ip-10-31-10-229.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec23ea4468ccc875d6f6db60ff64318a", "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/1", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.11.248 53716 10.31.10.229 22", "XDG_SESSION_CLASS": "user", 
"SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.11.248 53716 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/1"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "gather_subset": ["min"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["min"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] 
removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing 
ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # 
cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing 
ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy 
ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.process # destroy unicodedata # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy selectors # destroy _multiprocessing # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy errno # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # 
destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. [WARNING]: Module invocation had junk after the JSON data: # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing 
ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy 
ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing 
ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy 
ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress 
# destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.process # destroy unicodedata # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy selectors # destroy _multiprocessing # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy errno # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping 
codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks 7557 1726882076.27365: done with _execute_module (setup, {'gather_subset': 'min', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882075.7197113-7662-281360842359470/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7557 1726882076.27368: _low_level_execute_command(): starting 7557 1726882076.27371: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882075.7197113-7662-281360842359470/ > /dev/null 2>&1 && sleep 0' 7557 1726882076.27373: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7557 1726882076.27376: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7557 1726882076.27570: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882076.27576: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7557 1726882076.27581: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 <<< 7557 
1726882076.27583: stderr chunk (state=3): >>>debug2: match not found <<< 7557 1726882076.27585: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882076.27588: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7557 1726882076.27589: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.10.229 is address <<< 7557 1726882076.27592: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7557 1726882076.27596: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7557 1726882076.27598: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882076.27600: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882076.27602: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882076.27631: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882076.27717: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 7557 1726882076.30223: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882076.30239: stderr chunk (state=3): >>><<< 7557 1726882076.30243: stdout chunk (state=3): >>><<< 7557 1726882076.30257: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 7557 1726882076.30262: handler run complete 7557 1726882076.30289: variable 'ansible_facts' from source: unknown 7557 1726882076.30331: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882076.30397: variable 'ansible_facts' from source: unknown 7557 1726882076.30439: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882076.30469: attempt loop complete, returning result 7557 1726882076.30472: _execute() done 7557 1726882076.30474: dumping result to json 7557 1726882076.30482: done dumping result, returning 7557 1726882076.30491: done running 
TaskExecutor() for managed_node3/TASK: Gather the minimum subset of ansible_facts required by the network role test [12673a56-9f93-ed48-b3a5-000000000166] 7557 1726882076.30499: sending task result for task 12673a56-9f93-ed48-b3a5-000000000166 7557 1726882076.30626: done sending task result for task 12673a56-9f93-ed48-b3a5-000000000166 7557 1726882076.30628: WORKER PROCESS EXITING ok: [managed_node3] 7557 1726882076.30729: no more pending results, returning what we have 7557 1726882076.30731: results queue empty 7557 1726882076.30732: checking for any_errors_fatal 7557 1726882076.30733: done checking for any_errors_fatal 7557 1726882076.30734: checking for max_fail_percentage 7557 1726882076.30735: done checking for max_fail_percentage 7557 1726882076.30736: checking to see if all hosts have failed and the running result is not ok 7557 1726882076.30738: done checking to see if all hosts have failed 7557 1726882076.30739: getting the remaining hosts for this loop 7557 1726882076.30740: done getting the remaining hosts for this loop 7557 1726882076.30743: getting the next task for host managed_node3 7557 1726882076.30751: done getting next task for host managed_node3 7557 1726882076.30753: ^ task is: TASK: Check if system is ostree 7557 1726882076.30756: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7557 1726882076.30759: getting variables 7557 1726882076.30760: in VariableManager get_vars() 7557 1726882076.30788: Calling all_inventory to load vars for managed_node3 7557 1726882076.30791: Calling groups_inventory to load vars for managed_node3 7557 1726882076.30804: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882076.30818: Calling all_plugins_play to load vars for managed_node3 7557 1726882076.30821: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882076.30824: Calling groups_plugins_play to load vars for managed_node3 7557 1726882076.31017: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882076.31207: done with get_vars() 7557 1726882076.31219: done getting variables TASK [Check if system is ostree] *********************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:17 Friday 20 September 2024 21:27:56 -0400 (0:00:00.680) 0:00:02.166 ****** 7557 1726882076.31318: entering _queue_task() for managed_node3/stat 7557 1726882076.31548: worker is 1 (out of 1 available) 7557 1726882076.31561: exiting _queue_task() for managed_node3/stat 7557 1726882076.31572: done queuing things up, now waiting for results queue to drain 7557 1726882076.31574: waiting for pending results... 
7557 1726882076.31827: running TaskExecutor() for managed_node3/TASK: Check if system is ostree 7557 1726882076.31903: in run() - task 12673a56-9f93-ed48-b3a5-000000000168 7557 1726882076.31916: variable 'ansible_search_path' from source: unknown 7557 1726882076.31932: variable 'ansible_search_path' from source: unknown 7557 1726882076.31964: calling self._execute() 7557 1726882076.32024: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882076.32029: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882076.32037: variable 'omit' from source: magic vars 7557 1726882076.32399: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 7557 1726882076.32555: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 7557 1726882076.32589: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 7557 1726882076.32618: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 7557 1726882076.32642: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 7557 1726882076.32732: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 7557 1726882076.32749: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 7557 1726882076.32767: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 7557 1726882076.32788: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 7557 1726882076.32877: Evaluated conditional (not __network_is_ostree is defined): True 7557 1726882076.32880: variable 'omit' from source: magic vars 7557 1726882076.32912: variable 'omit' from source: magic vars 7557 1726882076.32936: variable 'omit' from source: magic vars 7557 1726882076.32954: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7557 1726882076.32975: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7557 1726882076.32988: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7557 1726882076.33008: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7557 1726882076.33017: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7557 1726882076.33038: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7557 1726882076.33041: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882076.33043: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882076.33114: Set connection var ansible_module_compression to ZIP_DEFLATED 7557 1726882076.33120: Set connection var ansible_shell_executable to /bin/sh 7557 1726882076.33123: Set connection var 
ansible_shell_type to sh 7557 1726882076.33128: Set connection var ansible_pipelining to False 7557 1726882076.33130: Set connection var ansible_connection to ssh 7557 1726882076.33135: Set connection var ansible_timeout to 10 7557 1726882076.33150: variable 'ansible_shell_executable' from source: unknown 7557 1726882076.33152: variable 'ansible_connection' from source: unknown 7557 1726882076.33155: variable 'ansible_module_compression' from source: unknown 7557 1726882076.33157: variable 'ansible_shell_type' from source: unknown 7557 1726882076.33160: variable 'ansible_shell_executable' from source: unknown 7557 1726882076.33162: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882076.33166: variable 'ansible_pipelining' from source: unknown 7557 1726882076.33168: variable 'ansible_timeout' from source: unknown 7557 1726882076.33172: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882076.33274: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 7557 1726882076.33281: variable 'omit' from source: magic vars 7557 1726882076.33286: starting attempt loop 7557 1726882076.33289: running the handler 7557 1726882076.33305: _low_level_execute_command(): starting 7557 1726882076.33311: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7557 1726882076.33889: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7557 1726882076.33896: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882076.33901: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882076.33958: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882076.33990: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 7557 1726882076.36213: stdout chunk (state=3): >>>/root <<< 7557 1726882076.36349: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882076.36373: stderr chunk (state=3): >>><<< 7557 1726882076.36376: stdout chunk (state=3): >>><<< 7557 1726882076.36398: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 7557 1726882076.36409: _low_level_execute_command(): starting 7557 1726882076.36412: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882076.3639536-7708-43935943869073 `" && echo ansible-tmp-1726882076.3639536-7708-43935943869073="` echo /root/.ansible/tmp/ansible-tmp-1726882076.3639536-7708-43935943869073 `" ) && sleep 0' 7557 1726882076.36827: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7557 1726882076.36831: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7557 1726882076.36833: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882076.36836: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882076.36838: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882076.36886: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882076.36890: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882076.36946: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 7557 1726882076.39514: stdout chunk (state=3): >>>ansible-tmp-1726882076.3639536-7708-43935943869073=/root/.ansible/tmp/ansible-tmp-1726882076.3639536-7708-43935943869073 <<< 7557 1726882076.39669: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882076.39695: stderr chunk (state=3): >>><<< 7557 1726882076.39699: stdout chunk (state=3): >>><<< 7557 1726882076.39710: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882076.3639536-7708-43935943869073=/root/.ansible/tmp/ansible-tmp-1726882076.3639536-7708-43935943869073 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 7557 1726882076.39773: variable 'ansible_module_compression' from source: unknown 7557 1726882076.39809: ANSIBALLZ: Using lock for stat 7557 1726882076.39812: ANSIBALLZ: Acquiring lock 7557 1726882076.39815: ANSIBALLZ: Lock acquired: 140194287015008 7557 1726882076.39817: ANSIBALLZ: Creating module 7557 1726882076.50301: ANSIBALLZ: Writing module into payload 7557 1726882076.50316: ANSIBALLZ: Writing module 7557 1726882076.50350: ANSIBALLZ: Renaming module 7557 1726882076.50362: ANSIBALLZ: Done creating module 7557 1726882076.50384: variable 'ansible_facts' from source: unknown 7557 1726882076.50468: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882076.3639536-7708-43935943869073/AnsiballZ_stat.py 7557 1726882076.50692: Sending initial data 7557 1726882076.50908: Sent initial data (150 bytes) 7557 1726882076.51181: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882076.51186: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 <<< 7557 1726882076.51216: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882076.51221: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7557 1726882076.51224: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882076.51275: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882076.51286: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882076.51349: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 7557 1726882076.53521: stderr chunk (state=3): 
>>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 7557 1726882076.53601: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 7557 1726882076.53651: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-7557ap94rh2e/tmpadyrmb5j /root/.ansible/tmp/ansible-tmp-1726882076.3639536-7708-43935943869073/AnsiballZ_stat.py <<< 7557 1726882076.53654: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882076.3639536-7708-43935943869073/AnsiballZ_stat.py" <<< 7557 1726882076.53719: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-7557ap94rh2e/tmpadyrmb5j" to remote "/root/.ansible/tmp/ansible-tmp-1726882076.3639536-7708-43935943869073/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882076.3639536-7708-43935943869073/AnsiballZ_stat.py" <<< 7557 1726882076.54703: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882076.54734: stderr chunk (state=3): >>><<< 7557 1726882076.54820: stdout chunk (state=3): >>><<< 7557 1726882076.54840: done transferring module to remote 7557 1726882076.54866: _low_level_execute_command(): starting 7557 1726882076.54877: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882076.3639536-7708-43935943869073/ /root/.ansible/tmp/ansible-tmp-1726882076.3639536-7708-43935943869073/AnsiballZ_stat.py && sleep 0' 7557 1726882076.55783: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882076.55814: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882076.55912: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 7557 1726882076.58047: 
stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882076.58052: stdout chunk (state=3): >>><<< 7557 1726882076.58054: stderr chunk (state=3): >>><<< 7557 1726882076.58057: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 7557 1726882076.58059: _low_level_execute_command(): starting 7557 1726882076.58061: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882076.3639536-7708-43935943869073/AnsiballZ_stat.py && sleep 0' 7557 1726882076.58667: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7557 1726882076.58671: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7557 1726882076.58708: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882076.58767: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882076.58795: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882076.58888: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882076.60975: stdout chunk (state=3): >>>import _frozen_importlib # frozen <<< 7557 1726882076.61005: stdout chunk (state=3): >>>import _imp # builtin <<< 7557 1726882076.61038: stdout chunk (state=3): >>>import '_thread' # import '_warnings' # import '_weakref' # <<< 7557 1726882076.61089: stdout chunk (state=3): >>>import '_io' # import 'marshal' # <<< 7557 1726882076.61138: stdout 
chunk (state=3): >>>import 'posix' # <<< 7557 1726882076.61165: stdout chunk (state=3): >>>import '_frozen_importlib_external' # # installing zipimport hook <<< 7557 1726882076.61199: stdout chunk (state=3): >>>import 'time' # import 'zipimport' # # installed zipimport hook <<< 7557 1726882076.61256: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py <<< 7557 1726882076.61270: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # <<< 7557 1726882076.61304: stdout chunk (state=3): >>>import 'codecs' # <<< 7557 1726882076.61339: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py <<< 7557 1726882076.61358: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbcce6dc4d0> <<< 7557 1726882076.61398: stdout chunk (state=3): >>>import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbcce6abb00> <<< 7557 1726882076.61424: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbcce6dea50> <<< 7557 1726882076.61458: stdout chunk (state=3): >>>import '_signal' # import '_abc' # <<< 7557 1726882076.61477: stdout chunk (state=3): >>>import 'abc' # <<< 7557 1726882076.61487: stdout chunk (state=3): >>>import 'io' # <<< 7557 1726882076.61518: stdout chunk (state=3): >>>import '_stat' # import 'stat' # <<< 7557 1726882076.61602: stdout chunk (state=3): >>>import '_collections_abc' # <<< 7557 1726882076.61660: stdout chunk (state=3): >>>import 'genericpath' # import 'posixpath' # <<< 7557 1726882076.61678: stdout chunk (state=3): >>>import 'os' # import '_sitebuiltins' # <<< 7557 1726882076.61717: stdout chunk (state=3): >>>Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' <<< 7557 1726882076.61720: stdout chunk (state=3): >>>Adding directory: '/usr/lib/python3.12/site-packages' <<< 7557 1726882076.61772: stdout chunk (state=3): >>>Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py <<< 7557 1726882076.61775: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' <<< 7557 1726882076.61790: stdout chunk (state=3): >>>import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbcce491130> <<< 7557 1726882076.61838: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py <<< 7557 1726882076.61866: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbcce491fa0> <<< 7557 1726882076.61887: stdout chunk 
(state=3): >>>import 'site' # <<< 7557 1726882076.61914: stdout chunk (state=3): >>>Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. <<< 7557 1726882076.62146: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py <<< 7557 1726882076.62168: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' <<< 7557 1726882076.62201: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py <<< 7557 1726882076.62216: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py <<< 7557 1726882076.62242: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' <<< 7557 1726882076.62263: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py <<< 7557 1726882076.62298: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' <<< 7557 1726882076.62345: stdout chunk (state=3): >>>import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbcce4cfe90> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py <<< 7557 1726882076.62350: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' <<< 7557 1726882076.62383: stdout chunk (state=3): >>>import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbcce4cff50> <<< 7557 1726882076.62386: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py <<< 7557 1726882076.62467: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py <<< 7557 1726882076.62492: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' <<< 7557 1726882076.62520: stdout chunk (state=3): >>>import 'itertools' # <<< 7557 1726882076.62562: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py <<< 7557 1726882076.62583: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbcce5078c0> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbcce507f50> <<< 7557 1726882076.62600: stdout chunk (state=3): >>>import '_collections' # <<< 7557 1726882076.62645: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbcce4e7b60> <<< 7557 1726882076.62657: stdout chunk (state=3): >>>import '_functools' # <<< 7557 1726882076.62691: stdout chunk (state=3): >>>import 'functools' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fbcce4e5280> <<< 7557 1726882076.62769: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbcce4cd040> <<< 7557 1726882076.62815: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py <<< 7557 1726882076.62847: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # <<< 7557 1726882076.62882: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py <<< 7557 1726882076.62885: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' <<< 7557 1726882076.62923: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' <<< 7557 1726882076.62945: stdout chunk (state=3): >>>import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbcce527800> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbcce526450> <<< 7557 1726882076.62980: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py <<< 7557 1726882076.62991: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbcce4e6150> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbcce524bc0> <<< 7557 1726882076.63045: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py <<< 7557 1726882076.63063: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbcce55c890> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbcce4cc2c0> <<< 7557 1726882076.63115: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' <<< 7557 1726882076.63124: stdout chunk (state=3): >>># extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbcce55cd40> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbcce55cbf0><<< 7557 1726882076.63176: stdout chunk (state=3): >>> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbcce55cfe0> <<< 7557 1726882076.63194: stdout chunk (state=3): >>>import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbcce4cade0> <<< 7557 1726882076.63225: stdout chunk 
(state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' <<< 7557 1726882076.63228: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py <<< 7557 1726882076.63285: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' <<< 7557 1726882076.63288: stdout chunk (state=3): >>>import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbcce55d6d0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbcce55d3a0> import 'importlib.machinery' # <<< 7557 1726882076.63334: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' <<< 7557 1726882076.63340: stdout chunk (state=3): >>>import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbcce55e5d0> <<< 7557 1726882076.63385: stdout chunk (state=3): >>>import 'importlib.util' # import 'runpy' # <<< 7557 1726882076.63389: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py <<< 7557 1726882076.63449: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' <<< 7557 1726882076.63453: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbcce5747d0> <<< 7557 1726882076.63467: stdout chunk (state=3): >>>import 'errno' # <<< 7557 1726882076.63506: stdout chunk (state=3): >>># extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' <<< 7557 1726882076.63558: stdout chunk (state=3): >>>import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbcce575eb0> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' <<< 7557 1726882076.63564: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py <<< 7557 1726882076.63588: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbcce576d50> <<< 7557 1726882076.63634: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbcce577380> <<< 7557 1726882076.63712: stdout chunk (state=3): >>>import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbcce5762a0> <<< 7557 1726882076.63774: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbcce577e00> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbcce577530> <<< 7557 1726882076.63897: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbcce55e540> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' <<< 7557 1726882076.63901: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbcce2f3ce0> <<< 7557 1726882076.64078: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' <<< 7557 1726882076.64082: stdout chunk (state=3): >>># extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbcce31c7d0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbcce31c530> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbcce31c740> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 7557 1726882076.64245: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbcce31d100> <<< 7557 1726882076.64312: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbcce31dac0> <<< 7557 1726882076.64353: stdout chunk (state=3): >>>import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object 
at 0x7fbcce31c9b0> <<< 7557 1726882076.64379: stdout chunk (state=3): >>>import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbcce2f1e80> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py <<< 7557 1726882076.64420: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py <<< 7557 1726882076.64465: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbcce31eea0> <<< 7557 1726882076.64497: stdout chunk (state=3): >>>import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbcce31dbe0> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbcce55ecf0> <<< 7557 1726882076.64500: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py <<< 7557 1726882076.64573: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' <<< 7557 1726882076.64668: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' <<< 7557 1726882076.64702: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbcce343230> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' <<< 7557 1726882076.64725: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py <<< 7557 1726882076.64741: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' <<< 7557 1726882076.64805: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbcce36b5c0> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py <<< 7557 1726882076.64839: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' <<< 7557 1726882076.65023: stdout chunk (state=3): >>>import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbcce3cc3b0> <<< 7557 1726882076.65041: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' <<< 7557 1726882076.65118: stdout chunk (state=3): >>>import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbcce3ceb10> <<< 7557 1726882076.65188: stdout 
chunk (state=3): >>>import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbcce3cc4d0> <<< 7557 1726882076.65269: stdout chunk (state=3): >>>import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbcce38d3d0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbccdd153d0> <<< 7557 1726882076.65390: stdout chunk (state=3): >>>import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbcce36a3c0> <<< 7557 1726882076.65409: stdout chunk (state=3): >>>import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbcce31fe00> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7fbcce36a9c0> <<< 7557 1726882076.65713: stdout chunk (state=3): >>># zipimport: found 30 names in '/tmp/ansible_stat_payload_tl626m8d/ansible_stat_payload.zip' # zipimport: zlib available <<< 7557 1726882076.65841: stdout chunk (state=3): >>># zipimport: zlib available <<< 7557 1726882076.65956: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py <<< 7557 1726882076.65983: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' <<< 7557 1726882076.66011: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' <<< 7557 1726882076.66042: stdout chunk (state=3): >>>import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbccdd6b080> import '_typing' # <<< 7557 1726882076.66208: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbccdd49f70> <<< 7557 1726882076.66222: stdout chunk (state=3): >>>import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbccdd49100> # zipimport: zlib available <<< 7557 1726882076.66254: stdout chunk (state=3): >>>import 'ansible' # # zipimport: zlib available <<< 7557 1726882076.66304: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available <<< 7557 1726882076.67701: stdout chunk (state=3): >>># zipimport: zlib available <<< 7557 1726882076.68778: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbccdd68f20> <<< 7557 1726882076.68957: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' <<< 7557 1726882076.68961: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' <<< 7557 1726882076.69028: stdout chunk (state=3): >>># extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbccdd92ae0> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbccdd92870> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbccdd92180> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbccdd92bd0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbccdd6baa0> import 'atexit' # <<< 7557 1726882076.69053: stdout chunk (state=3): >>># extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbccdd93890> <<< 7557 1726882076.69082: stdout chunk (state=3): >>># extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbccdd93ad0> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py <<< 7557 1726882076.69155: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # <<< 7557 1726882076.69211: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbccdd93f50> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py <<< 7557 1726882076.69261: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' <<< 7557 1726882076.69267: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbccdc01d90> <<< 7557 1726882076.69332: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbccdc03980> <<< 7557 1726882076.69386: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' <<< 7557 1726882076.69410: stdout chunk (state=3): >>>import 
'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbccdc04320> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' <<< 7557 1726882076.69455: stdout chunk (state=3): >>>import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbccdc054c0> <<< 7557 1726882076.69492: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py <<< 7557 1726882076.69497: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' <<< 7557 1726882076.69522: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' <<< 7557 1726882076.69560: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbccdc07f20> <<< 7557 1726882076.69606: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbcce576bd0> <<< 7557 1726882076.69635: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbccdc06210> <<< 7557 1726882076.69653: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py <<< 7557 1726882076.69685: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' <<< 7557 1726882076.69701: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py <<< 7557 1726882076.69765: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' <<< 7557 1726882076.69768: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbccdc0fe00> <<< 7557 1726882076.69796: stdout chunk (state=3): >>>import '_tokenize' # <<< 7557 1726882076.69858: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbccdc0e8d0> <<< 7557 1726882076.69862: stdout chunk (state=3): >>>import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbccdc0e660> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py <<< 7557 1726882076.69887: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' <<< 7557 1726882076.69945: stdout chunk (state=3): >>>import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbccdc0eba0> <<< 7557 1726882076.69974: stdout chunk (state=3): >>>import 'traceback' 
# <_frozen_importlib_external.SourceFileLoader object at 0x7fbccdc066f0> <<< 7557 1726882076.70035: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbccdc57a40> <<< 7557 1726882076.70042: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py <<< 7557 1726882076.70065: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbccdc58140> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' <<< 7557 1726882076.70097: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' <<< 7557 1726882076.70139: stdout chunk (state=3): >>># extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbccdc59be0> <<< 7557 1726882076.70150: stdout chunk (state=3): >>>import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbccdc599a0> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py <<< 7557 1726882076.70250: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' <<< 7557 1726882076.70310: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' <<< 7557 1726882076.70334: stdout chunk (state=3): >>># extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbccdc5c110> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbccdc5a2d0> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py <<< 7557 1726882076.70383: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' <<< 7557 1726882076.70399: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # <<< 7557 1726882076.70446: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbccdc5f8f0> <<< 7557 1726882076.70558: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbccdc5c2c0> <<< 7557 1726882076.70621: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded 
from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbccdc60c50> <<< 7557 1726882076.70648: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbccdc60ce0> <<< 7557 1726882076.70706: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbccdc60a70> <<< 7557 1726882076.70734: stdout chunk (state=3): >>>import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbccdc58260> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' <<< 7557 1726882076.70748: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py <<< 7557 1726882076.70784: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' <<< 7557 1726882076.70798: stdout chunk (state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' <<< 7557 1726882076.70826: stdout chunk (state=3): >>># extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbccdaec200> <<< 7557 1726882076.70971: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' <<< 7557 1726882076.70995: stdout chunk (state=3): >>># extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbccdaed310> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbccdc62990> <<< 7557 1726882076.71125: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbccdc63d40> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbccdc625d0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available <<< 7557 1726882076.71398: stdout chunk (state=3): >>># zipimport: zlib available <<< 7557 
1726882076.71445: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available <<< 7557 1726882076.71510: stdout chunk (state=3): >>># zipimport: zlib available <<< 7557 1726882076.72031: stdout chunk (state=3): >>># zipimport: zlib available <<< 7557 1726882076.72673: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # <<< 7557 1726882076.72680: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # <<< 7557 1726882076.72695: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py <<< 7557 1726882076.72713: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' <<< 7557 1726882076.72775: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' <<< 7557 1726882076.72780: stdout chunk (state=3): >>># extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbccdaf54c0> <<< 7557 1726882076.72985: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py <<< 7557 1726882076.72989: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbccdaf6210> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbccdaed5e0> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available <<< 7557 1726882076.73009: stdout chunk (state=3): >>>import 'ansible.module_utils._text' # # zipimport: zlib available <<< 7557 1726882076.73231: stdout chunk (state=3): >>># zipimport: zlib available <<< 7557 1726882076.73455: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py <<< 7557 1726882076.73473: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbccdaf6300> <<< 7557 1726882076.73576: stdout chunk (state=3): >>># zipimport: zlib available <<< 7557 1726882076.74230: stdout chunk (state=3): >>># zipimport: zlib available <<< 7557 1726882076.74955: stdout chunk (state=3): >>># zipimport: zlib available <<< 7557 1726882076.75399: stdout chunk (state=3): >>># zipimport: zlib available <<< 7557 1726882076.75403: stdout chunk (state=3): >>>import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available <<< 7557 1726882076.75507: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.errors' # <<< 7557 1726882076.75617: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: 
zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available <<< 7557 1726882076.75800: stdout chunk (state=3): >>># zipimport: zlib available <<< 7557 1726882076.76061: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py <<< 7557 1726882076.76121: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' <<< 7557 1726882076.76137: stdout chunk (state=3): >>>import '_ast' # <<< 7557 1726882076.76210: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbccdaf7590> # zipimport: zlib available <<< 7557 1726882076.76297: stdout chunk (state=3): >>># zipimport: zlib available <<< 7557 1726882076.76381: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # <<< 7557 1726882076.76401: stdout chunk (state=3): >>> import 'ansible.module_utils.common.arg_spec' # <<< 7557 1726882076.76425: stdout chunk (state=3): >>># zipimport: zlib available <<< 7557 1726882076.76455: stdout chunk (state=3): >>># zipimport: zlib available <<< 7557 1726882076.76510: stdout chunk (state=3): >>>import 'ansible.module_utils.common.locale' # <<< 7557 1726882076.76520: stdout chunk (state=3): >>># zipimport: zlib available <<< 7557 1726882076.76549: stdout chunk (state=3): >>># zipimport: zlib available <<< 7557 1726882076.76590: stdout chunk (state=3): >>># zipimport: zlib available <<< 7557 1726882076.76815: stdout chunk (state=3): >>># zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' <<< 7557 1726882076.76926: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbccdb023f0> <<< 7557 1726882076.76983: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbccdafcda0> <<< 7557 1726882076.77032: stdout chunk (state=3): >>>import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available <<< 7557 1726882076.77126: stdout chunk (state=3): >>># zipimport: zlib available <<< 7557 1726882076.77213: stdout chunk (state=3): >>># zipimport: zlib available <<< 7557 1726882076.77251: stdout chunk (state=3): >>># zipimport: zlib available <<< 7557 1726882076.77308: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py <<< 7557 1726882076.77314: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' <<< 7557 1726882076.77335: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py <<< 7557 1726882076.77402: stdout chunk (state=3): >>># code object from 
'/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py <<< 7557 1726882076.77512: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py <<< 7557 1726882076.77535: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' <<< 7557 1726882076.77636: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbccdbeeb70> <<< 7557 1726882076.77783: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbccddca870> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbccdb02120> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbccdaf48f0> <<< 7557 1726882076.77812: stdout chunk (state=3): >>># destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available <<< 7557 1726882076.77859: stdout chunk (state=3): >>># zipimport: zlib available <<< 7557 1726882076.77884: stdout chunk (state=3): >>>import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # <<< 7557 1726882076.77995: stdout chunk (state=3): >>>import 'ansible.module_utils.basic' # <<< 7557 1726882076.78029: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available <<< 7557 1726882076.78268: stdout chunk (state=3): >>># zipimport: zlib available <<< 7557 1726882076.78597: stdout chunk (state=3): >>># zipimport: zlib available <<< 7557 1726882076.78705: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} <<< 7557 1726882076.78730: stdout chunk (state=3): >>># destroy __main__ <<< 7557 1726882076.79336: stdout chunk (state=3): >>># clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib<<< 7557 1726882076.79417: stdout chunk (state=3): >>> # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # 
cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _po<<< 7557 1726882076.79421: stdout chunk (state=3): >>>sixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing 
_datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules <<< 7557 1726882076.79649: stdout chunk (state=3): >>># destroy _sitebuiltins <<< 7557 
1726882076.79690: stdout chunk (state=3): >>># destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma <<< 7557 1726882076.79703: stdout chunk (state=3): >>># destroy _blake2 # destroy binascii # destroy struct # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy fnmatch # destroy ipaddress <<< 7557 1726882076.79744: stdout chunk (state=3): >>># destroy ntpath <<< 7557 1726882076.79807: stdout chunk (state=3): >>># destroy importlib # destroy zipimport # destroy __main__ # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings <<< 7557 1726882076.79814: stdout chunk (state=3): >>># destroy _locale # destroy pwd # destroy locale # destroy signal # destroy fcntl # destroy select # destroy _signal # destroy _posixsubprocess <<< 7557 1726882076.79850: stdout chunk (state=3): >>># destroy syslog # destroy uuid # destroy selectors # destroy errno <<< 7557 1726882076.79886: stdout chunk (state=3): >>># destroy array # destroy datetime # destroy selinux <<< 7557 1726882076.79904: stdout chunk (state=3): >>># destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy json # destroy logging # destroy shlex # destroy subprocess <<< 7557 1726882076.79951: stdout chunk (state=3): >>># cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback <<< 7557 1726882076.80021: stdout chunk (state=3): >>># destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing <<< 7557 1726882076.80044: stdout chunk (state=3): >>># cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler <<< 7557 1726882076.80125: stdout chunk (state=3): >>># destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io <<< 7557 
1726882076.80152: stdout chunk (state=3): >>># destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime <<< 7557 1726882076.80583: stdout chunk (state=3): >>># destroy sys.monitoring <<< 7557 1726882076.80746: stdout chunk (state=3): >>># destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _string # destroy re <<< 7557 1726882076.80751: stdout chunk (state=3): >>># destroy itertools # destroy _abc # destroy _sre # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks <<< 7557 1726882076.81033: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. 
<<< 7557 1726882076.81059: stderr chunk (state=3): >>><<< 7557 1726882076.81062: stdout chunk (state=3): >>><<< 7557 1726882076.81132: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbcce6dc4d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbcce6abb00> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbcce6dea50> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbcce491130> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbcce491fa0> import 'site' # Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
# /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbcce4cfe90> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbcce4cff50> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbcce5078c0> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbcce507f50> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbcce4e7b60> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbcce4e5280> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbcce4cd040> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbcce527800> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbcce526450> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbcce4e6150> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbcce524bc0> # 
/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbcce55c890> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbcce4cc2c0> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbcce55cd40> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbcce55cbf0> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbcce55cfe0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbcce4cade0> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbcce55d6d0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbcce55d3a0> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbcce55e5d0> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbcce5747d0> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbcce575eb0> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 
0x7fbcce576d50> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbcce577380> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbcce5762a0> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbcce577e00> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbcce577530> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbcce55e540> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbcce2f3ce0> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbcce31c7d0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbcce31c530> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbcce31c740> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbcce31d100> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbcce31dac0> import 'hashlib' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fbcce31c9b0> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbcce2f1e80> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbcce31eea0> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbcce31dbe0> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbcce55ecf0> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbcce343230> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbcce36b5c0> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbcce3cc3b0> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbcce3ceb10> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbcce3cc4d0> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbcce38d3d0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbccdd153d0> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbcce36a3c0> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbcce31fe00> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # 
<_frozen_importlib_external.SourcelessFileLoader object at 0x7fbcce36a9c0> # zipimport: found 30 names in '/tmp/ansible_stat_payload_tl626m8d/ansible_stat_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbccdd6b080> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbccdd49f70> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbccdd49100> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbccdd68f20> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbccdd92ae0> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbccdd92870> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbccdd92180> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbccdd92bd0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbccdd6baa0> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbccdd93890> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from 
'/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbccdd93ad0> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbccdd93f50> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbccdc01d90> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbccdc03980> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbccdc04320> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbccdc054c0> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbccdc07f20> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbcce576bd0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbccdc06210> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbccdc0fe00> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbccdc0e8d0> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbccdc0e660> # 
/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbccdc0eba0> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbccdc066f0> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbccdc57a40> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbccdc58140> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbccdc59be0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbccdc599a0> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbccdc5c110> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbccdc5a2d0> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbccdc5f8f0> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbccdc5c2c0> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbccdc60c50> # extension module 'systemd._reader' loaded from 
'/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbccdc60ce0> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbccdc60a70> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbccdc58260> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbccdaec200> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbccdaed310> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbccdc62990> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbccdc63d40> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbccdc625d0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7fbccdaf54c0> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbccdaf6210> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbccdaed5e0> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbccdaf6300> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbccdaf7590> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbccdb023f0> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbccdafcda0> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches 
/usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbccdbeeb70> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbccddca870> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbccdb02120> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbccdaf48f0> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} # destroy __main__ # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing 
re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy 
ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy struct # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy fnmatch # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy pwd # destroy locale # destroy signal # destroy fcntl # destroy select # destroy _signal # destroy _posixsubprocess # 
destroy syslog # destroy uuid # destroy selectors # destroy errno # destroy array # destroy datetime # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy json # destroy logging # destroy shlex # destroy subprocess # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # 
destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _string # destroy re # destroy itertools # destroy _abc # destroy _sre # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. 
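The OpenSSH stderr above shows connection multiplexing at work (debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41', mux_client_request_session): each module invocation for this host reuses one persistent SSH master instead of negotiating a fresh connection. As a hedged illustration only (these exact options and values are not taken from this run's configuration; Ansible's ssh connection plugin applies similar ControlMaster defaults on its own), multiplexing like this can be tuned per host through ansible_ssh_common_args in the inventory:

all:
  hosts:
    managed_node3:
      ansible_host: 10.31.10.229
      # Illustrative OpenSSH multiplexing options: keep one master connection
      # open and reuse it for repeated module invocations against this host.
      ansible_ssh_common_args: >-
        -o ControlMaster=auto
        -o ControlPersist=60s
        -o ControlPath=~/.ansible/cp/%C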
[WARNING]: Module invocation had junk after the JSON data: (same interpreter shutdown output as shown above)
7557 1726882076.81661: done with _execute_module (stat, {'path': '/run/ostree-booted', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882076.3639536-7708-43935943869073/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7557 1726882076.81663: _low_level_execute_command(): starting 7557 1726882076.81666: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r
/root/.ansible/tmp/ansible-tmp-1726882076.3639536-7708-43935943869073/ > /dev/null 2>&1 && sleep 0' 7557 1726882076.81766: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7557 1726882076.81769: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found <<< 7557 1726882076.81772: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882076.81778: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882076.81787: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882076.81844: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882076.81847: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882076.81849: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882076.81896: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882076.84448: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882076.84472: stderr chunk (state=3): >>><<< 7557 1726882076.84475: stdout chunk (state=3): >>><<< 7557 1726882076.84492: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7557 1726882076.84497: handler run complete 7557 1726882076.84512: attempt loop complete, returning result 7557 1726882076.84515: _execute() done 7557 1726882076.84519: dumping result to json 7557 1726882076.84521: done dumping result, returning 7557 1726882076.84531: done running TaskExecutor() for managed_node3/TASK: Check if system is ostree 
[12673a56-9f93-ed48-b3a5-000000000168] 7557 1726882076.84533: sending task result for task 12673a56-9f93-ed48-b3a5-000000000168 7557 1726882076.84621: done sending task result for task 12673a56-9f93-ed48-b3a5-000000000168 7557 1726882076.84624: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "stat": { "exists": false } } 7557 1726882076.84716: no more pending results, returning what we have 7557 1726882076.84719: results queue empty 7557 1726882076.84719: checking for any_errors_fatal 7557 1726882076.84726: done checking for any_errors_fatal 7557 1726882076.84727: checking for max_fail_percentage 7557 1726882076.84728: done checking for max_fail_percentage 7557 1726882076.84729: checking to see if all hosts have failed and the running result is not ok 7557 1726882076.84729: done checking to see if all hosts have failed 7557 1726882076.84730: getting the remaining hosts for this loop 7557 1726882076.84731: done getting the remaining hosts for this loop 7557 1726882076.84734: getting the next task for host managed_node3 7557 1726882076.84741: done getting next task for host managed_node3 7557 1726882076.84743: ^ task is: TASK: Set flag to indicate system is ostree 7557 1726882076.84746: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7557 1726882076.84749: getting variables 7557 1726882076.84751: in VariableManager get_vars() 7557 1726882076.84779: Calling all_inventory to load vars for managed_node3 7557 1726882076.84782: Calling groups_inventory to load vars for managed_node3 7557 1726882076.84785: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882076.84799: Calling all_plugins_play to load vars for managed_node3 7557 1726882076.84802: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882076.84805: Calling groups_plugins_play to load vars for managed_node3 7557 1726882076.84953: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882076.85070: done with get_vars() 7557 1726882076.85078: done getting variables 7557 1726882076.85151: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Set flag to indicate system is ostree] *********************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:22 Friday 20 September 2024 21:27:56 -0400 (0:00:00.538) 0:00:02.704 ****** 7557 1726882076.85171: entering _queue_task() for managed_node3/set_fact 7557 1726882076.85172: Creating lock for set_fact 7557 1726882076.85373: worker is 1 (out of 1 available) 7557 1726882076.85387: exiting _queue_task() for managed_node3/set_fact 7557 1726882076.85399: done queuing things up, now waiting for results queue to drain 7557 1726882076.85401: waiting for pending results... 
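For orientation, the ok result above (stat exists: false) and the set_fact task just queued implement the usual ostree detection pattern in el_repo_setup.yml: a stat of /run/ostree-booted registered into a variable, then a fact derived from it. The following is a hedged reconstruction inferred from the module arguments, variable names and conditionals logged in this run, not a verbatim copy of that file:

- name: Check if system is ostree
  stat:
    path: /run/ostree-booted        # module_args as logged above
  register: __ostree_booted_stat    # variable name referenced later in this run

- name: Set flag to indicate system is ostree
  set_fact:
    __network_is_ostree: "{{ __ostree_booted_stat.stat.exists }}"
  when: not __network_is_ostree is defined   # conditional evaluated True in this run

Since /run/ostree-booted does not exist on this host, the derived fact comes out false, which is what the set_fact task reports next.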
7557 1726882076.85540: running TaskExecutor() for managed_node3/TASK: Set flag to indicate system is ostree 7557 1726882076.85607: in run() - task 12673a56-9f93-ed48-b3a5-000000000169 7557 1726882076.85620: variable 'ansible_search_path' from source: unknown 7557 1726882076.85625: variable 'ansible_search_path' from source: unknown 7557 1726882076.85653: calling self._execute() 7557 1726882076.85711: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882076.85715: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882076.85724: variable 'omit' from source: magic vars 7557 1726882076.86107: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 7557 1726882076.86267: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 7557 1726882076.86305: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 7557 1726882076.86331: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 7557 1726882076.86354: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 7557 1726882076.86422: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 7557 1726882076.86440: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 7557 1726882076.86457: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 7557 1726882076.86474: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 7557 1726882076.86564: Evaluated conditional (not __network_is_ostree is defined): True 7557 1726882076.86568: variable 'omit' from source: magic vars 7557 1726882076.86591: variable 'omit' from source: magic vars 7557 1726882076.86674: variable '__ostree_booted_stat' from source: set_fact 7557 1726882076.86714: variable 'omit' from source: magic vars 7557 1726882076.86737: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7557 1726882076.86756: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7557 1726882076.86770: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7557 1726882076.86782: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7557 1726882076.86791: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7557 1726882076.86817: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7557 1726882076.86820: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882076.86823: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882076.86890: Set connection var ansible_module_compression to ZIP_DEFLATED 7557 1726882076.86900: Set 
connection var ansible_shell_executable to /bin/sh 7557 1726882076.86903: Set connection var ansible_shell_type to sh 7557 1726882076.86908: Set connection var ansible_pipelining to False 7557 1726882076.86910: Set connection var ansible_connection to ssh 7557 1726882076.86915: Set connection var ansible_timeout to 10 7557 1726882076.86930: variable 'ansible_shell_executable' from source: unknown 7557 1726882076.86935: variable 'ansible_connection' from source: unknown 7557 1726882076.86938: variable 'ansible_module_compression' from source: unknown 7557 1726882076.86940: variable 'ansible_shell_type' from source: unknown 7557 1726882076.86944: variable 'ansible_shell_executable' from source: unknown 7557 1726882076.86946: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882076.86948: variable 'ansible_pipelining' from source: unknown 7557 1726882076.86950: variable 'ansible_timeout' from source: unknown 7557 1726882076.86952: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882076.87023: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 7557 1726882076.87031: variable 'omit' from source: magic vars 7557 1726882076.87036: starting attempt loop 7557 1726882076.87038: running the handler 7557 1726882076.87047: handler run complete 7557 1726882076.87055: attempt loop complete, returning result 7557 1726882076.87058: _execute() done 7557 1726882076.87062: dumping result to json 7557 1726882076.87064: done dumping result, returning 7557 1726882076.87068: done running TaskExecutor() for managed_node3/TASK: Set flag to indicate system is ostree [12673a56-9f93-ed48-b3a5-000000000169] 7557 1726882076.87079: sending task result for task 12673a56-9f93-ed48-b3a5-000000000169 7557 1726882076.87146: done sending task result for task 12673a56-9f93-ed48-b3a5-000000000169 7557 1726882076.87149: WORKER PROCESS EXITING ok: [managed_node3] => { "ansible_facts": { "__network_is_ostree": false }, "changed": false } 7557 1726882076.87235: no more pending results, returning what we have 7557 1726882076.87237: results queue empty 7557 1726882076.87238: checking for any_errors_fatal 7557 1726882076.87243: done checking for any_errors_fatal 7557 1726882076.87244: checking for max_fail_percentage 7557 1726882076.87245: done checking for max_fail_percentage 7557 1726882076.87245: checking to see if all hosts have failed and the running result is not ok 7557 1726882076.87246: done checking to see if all hosts have failed 7557 1726882076.87247: getting the remaining hosts for this loop 7557 1726882076.87248: done getting the remaining hosts for this loop 7557 1726882076.87251: getting the next task for host managed_node3 7557 1726882076.87258: done getting next task for host managed_node3 7557 1726882076.87260: ^ task is: TASK: Fix CentOS6 Base repo 7557 1726882076.87263: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7557 1726882076.87266: getting variables 7557 1726882076.87267: in VariableManager get_vars() 7557 1726882076.87295: Calling all_inventory to load vars for managed_node3 7557 1726882076.87297: Calling groups_inventory to load vars for managed_node3 7557 1726882076.87300: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882076.87309: Calling all_plugins_play to load vars for managed_node3 7557 1726882076.87311: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882076.87319: Calling groups_plugins_play to load vars for managed_node3 7557 1726882076.87464: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882076.87575: done with get_vars() 7557 1726882076.87582: done getting variables 7557 1726882076.87667: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Fix CentOS6 Base repo] *************************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:26 Friday 20 September 2024 21:27:56 -0400 (0:00:00.025) 0:00:02.729 ****** 7557 1726882076.87688: entering _queue_task() for managed_node3/copy 7557 1726882076.87868: worker is 1 (out of 1 available) 7557 1726882076.87880: exiting _queue_task() for managed_node3/copy 7557 1726882076.87892: done queuing things up, now waiting for results queue to drain 7557 1726882076.87895: waiting for pending results... 
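The copy task queued here ('Fix CentOS6 Base repo', el_repo_setup.yml:26) and the include that follows it are both conditional. As a hedged sketch only, with the repo file destination and contents treated as placeholders because they are not visible in this log, the structure implied by the conditionals evaluated in this run is roughly:

- name: Fix CentOS6 Base repo
  copy:
    dest: /etc/yum.repos.d/CentOS-Base.repo       # placeholder path, not shown in the log
    content: "{{ __centos6_base_repo_content }}"  # hypothetical variable
  when:
    - ansible_distribution == 'CentOS'
    - ansible_distribution_major_version == '6'

- name: Include the task 'enable_epel.yml'
  include_tasks: enable_epel.yml
  when: not __network_is_ostree | d(false)

On this host the first task is skipped (the major version is not 6) and the include runs, pulling enable_epel.yml into the task list for managed_node3, as the records that follow show.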
7557 1726882076.88038: running TaskExecutor() for managed_node3/TASK: Fix CentOS6 Base repo 7557 1726882076.88101: in run() - task 12673a56-9f93-ed48-b3a5-00000000016b 7557 1726882076.88112: variable 'ansible_search_path' from source: unknown 7557 1726882076.88117: variable 'ansible_search_path' from source: unknown 7557 1726882076.88144: calling self._execute() 7557 1726882076.88198: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882076.88203: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882076.88212: variable 'omit' from source: magic vars 7557 1726882076.88544: variable 'ansible_distribution' from source: facts 7557 1726882076.88560: Evaluated conditional (ansible_distribution == 'CentOS'): True 7557 1726882076.88646: variable 'ansible_distribution_major_version' from source: facts 7557 1726882076.88650: Evaluated conditional (ansible_distribution_major_version == '6'): False 7557 1726882076.88652: when evaluation is False, skipping this task 7557 1726882076.88657: _execute() done 7557 1726882076.88660: dumping result to json 7557 1726882076.88662: done dumping result, returning 7557 1726882076.88673: done running TaskExecutor() for managed_node3/TASK: Fix CentOS6 Base repo [12673a56-9f93-ed48-b3a5-00000000016b] 7557 1726882076.88679: sending task result for task 12673a56-9f93-ed48-b3a5-00000000016b 7557 1726882076.88761: done sending task result for task 12673a56-9f93-ed48-b3a5-00000000016b 7557 1726882076.88764: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version == '6'", "skip_reason": "Conditional result was False" } 7557 1726882076.88832: no more pending results, returning what we have 7557 1726882076.88835: results queue empty 7557 1726882076.88836: checking for any_errors_fatal 7557 1726882076.88839: done checking for any_errors_fatal 7557 1726882076.88840: checking for max_fail_percentage 7557 1726882076.88842: done checking for max_fail_percentage 7557 1726882076.88842: checking to see if all hosts have failed and the running result is not ok 7557 1726882076.88843: done checking to see if all hosts have failed 7557 1726882076.88844: getting the remaining hosts for this loop 7557 1726882076.88845: done getting the remaining hosts for this loop 7557 1726882076.88847: getting the next task for host managed_node3 7557 1726882076.88852: done getting next task for host managed_node3 7557 1726882076.88855: ^ task is: TASK: Include the task 'enable_epel.yml' 7557 1726882076.88857: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7557 1726882076.88860: getting variables 7557 1726882076.88861: in VariableManager get_vars() 7557 1726882076.88883: Calling all_inventory to load vars for managed_node3 7557 1726882076.88886: Calling groups_inventory to load vars for managed_node3 7557 1726882076.88888: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882076.88901: Calling all_plugins_play to load vars for managed_node3 7557 1726882076.88903: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882076.88906: Calling groups_plugins_play to load vars for managed_node3 7557 1726882076.89010: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882076.89144: done with get_vars() 7557 1726882076.89150: done getting variables TASK [Include the task 'enable_epel.yml'] ************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:51 Friday 20 September 2024 21:27:56 -0400 (0:00:00.015) 0:00:02.745 ****** 7557 1726882076.89211: entering _queue_task() for managed_node3/include_tasks 7557 1726882076.89392: worker is 1 (out of 1 available) 7557 1726882076.89405: exiting _queue_task() for managed_node3/include_tasks 7557 1726882076.89415: done queuing things up, now waiting for results queue to drain 7557 1726882076.89416: waiting for pending results... 7557 1726882076.89554: running TaskExecutor() for managed_node3/TASK: Include the task 'enable_epel.yml' 7557 1726882076.89606: in run() - task 12673a56-9f93-ed48-b3a5-00000000016c 7557 1726882076.89618: variable 'ansible_search_path' from source: unknown 7557 1726882076.89621: variable 'ansible_search_path' from source: unknown 7557 1726882076.89652: calling self._execute() 7557 1726882076.89705: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882076.89709: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882076.89718: variable 'omit' from source: magic vars 7557 1726882076.90049: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 7557 1726882076.91606: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 7557 1726882076.91650: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 7557 1726882076.91685: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 7557 1726882076.91712: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 7557 1726882076.91735: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 7557 1726882076.91792: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7557 1726882076.91813: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7557 1726882076.91840: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7557 1726882076.91861: 
Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7557 1726882076.91872: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7557 1726882076.91954: variable '__network_is_ostree' from source: set_fact 7557 1726882076.91968: Evaluated conditional (not __network_is_ostree | d(false)): True 7557 1726882076.91973: _execute() done 7557 1726882076.91976: dumping result to json 7557 1726882076.91978: done dumping result, returning 7557 1726882076.91984: done running TaskExecutor() for managed_node3/TASK: Include the task 'enable_epel.yml' [12673a56-9f93-ed48-b3a5-00000000016c] 7557 1726882076.91992: sending task result for task 12673a56-9f93-ed48-b3a5-00000000016c 7557 1726882076.92074: done sending task result for task 12673a56-9f93-ed48-b3a5-00000000016c 7557 1726882076.92077: WORKER PROCESS EXITING 7557 1726882076.92111: no more pending results, returning what we have 7557 1726882076.92116: in VariableManager get_vars() 7557 1726882076.92149: Calling all_inventory to load vars for managed_node3 7557 1726882076.92151: Calling groups_inventory to load vars for managed_node3 7557 1726882076.92154: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882076.92164: Calling all_plugins_play to load vars for managed_node3 7557 1726882076.92167: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882076.92169: Calling groups_plugins_play to load vars for managed_node3 7557 1726882076.92322: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882076.92432: done with get_vars() 7557 1726882076.92439: variable 'ansible_search_path' from source: unknown 7557 1726882076.92439: variable 'ansible_search_path' from source: unknown 7557 1726882076.92464: we have included files to process 7557 1726882076.92465: generating all_blocks data 7557 1726882076.92466: done generating all_blocks data 7557 1726882076.92470: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 7557 1726882076.92471: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 7557 1726882076.92473: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 7557 1726882076.93054: done processing included file 7557 1726882076.93055: iterating over new_blocks loaded from include file 7557 1726882076.93056: in VariableManager get_vars() 7557 1726882076.93064: done with get_vars() 7557 1726882076.93065: filtering new block on tags 7557 1726882076.93079: done filtering new block on tags 7557 1726882076.93080: in VariableManager get_vars() 7557 1726882076.93086: done with get_vars() 7557 1726882076.93087: filtering new block on tags 7557 1726882076.93105: done filtering new block on tags 7557 1726882076.93107: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml for managed_node3 7557 1726882076.93111: extending task lists for all hosts with included blocks 7557 1726882076.93171: done extending task lists 7557 1726882076.93172: done 
processing included files 7557 1726882076.93173: results queue empty 7557 1726882076.93173: checking for any_errors_fatal 7557 1726882076.93175: done checking for any_errors_fatal 7557 1726882076.93176: checking for max_fail_percentage 7557 1726882076.93176: done checking for max_fail_percentage 7557 1726882076.93177: checking to see if all hosts have failed and the running result is not ok 7557 1726882076.93177: done checking to see if all hosts have failed 7557 1726882076.93177: getting the remaining hosts for this loop 7557 1726882076.93178: done getting the remaining hosts for this loop 7557 1726882076.93179: getting the next task for host managed_node3 7557 1726882076.93182: done getting next task for host managed_node3 7557 1726882076.93184: ^ task is: TASK: Create EPEL {{ ansible_distribution_major_version }} 7557 1726882076.93185: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7557 1726882076.93187: getting variables 7557 1726882076.93187: in VariableManager get_vars() 7557 1726882076.93195: Calling all_inventory to load vars for managed_node3 7557 1726882076.93197: Calling groups_inventory to load vars for managed_node3 7557 1726882076.93198: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882076.93202: Calling all_plugins_play to load vars for managed_node3 7557 1726882076.93207: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882076.93209: Calling groups_plugins_play to load vars for managed_node3 7557 1726882076.93300: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882076.93410: done with get_vars() 7557 1726882076.93416: done getting variables 7557 1726882076.93460: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) 7557 1726882076.93598: variable 'ansible_distribution_major_version' from source: facts TASK [Create EPEL 10] ********************************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:8 Friday 20 September 2024 21:27:56 -0400 (0:00:00.044) 0:00:02.789 ****** 7557 1726882076.93630: entering _queue_task() for managed_node3/command 7557 1726882076.93631: Creating lock for command 7557 1726882076.93840: worker is 1 (out of 1 available) 7557 1726882076.93852: exiting _queue_task() for managed_node3/command 7557 1726882076.93863: done queuing things up, now waiting for results queue to drain 7557 
1726882076.93864: waiting for pending results... 7557 1726882076.94029: running TaskExecutor() for managed_node3/TASK: Create EPEL 10 7557 1726882076.94114: in run() - task 12673a56-9f93-ed48-b3a5-000000000186 7557 1726882076.94123: variable 'ansible_search_path' from source: unknown 7557 1726882076.94127: variable 'ansible_search_path' from source: unknown 7557 1726882076.94155: calling self._execute() 7557 1726882076.94215: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882076.94219: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882076.94228: variable 'omit' from source: magic vars 7557 1726882076.94500: variable 'ansible_distribution' from source: facts 7557 1726882076.94510: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 7557 1726882076.94591: variable 'ansible_distribution_major_version' from source: facts 7557 1726882076.94602: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 7557 1726882076.94606: when evaluation is False, skipping this task 7557 1726882076.94608: _execute() done 7557 1726882076.94611: dumping result to json 7557 1726882076.94613: done dumping result, returning 7557 1726882076.94619: done running TaskExecutor() for managed_node3/TASK: Create EPEL 10 [12673a56-9f93-ed48-b3a5-000000000186] 7557 1726882076.94624: sending task result for task 12673a56-9f93-ed48-b3a5-000000000186 7557 1726882076.94717: done sending task result for task 12673a56-9f93-ed48-b3a5-000000000186 7557 1726882076.94720: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 7557 1726882076.94782: no more pending results, returning what we have 7557 1726882076.94785: results queue empty 7557 1726882076.94786: checking for any_errors_fatal 7557 1726882076.94787: done checking for any_errors_fatal 7557 1726882076.94787: checking for max_fail_percentage 7557 1726882076.94789: done checking for max_fail_percentage 7557 1726882076.94789: checking to see if all hosts have failed and the running result is not ok 7557 1726882076.94790: done checking to see if all hosts have failed 7557 1726882076.94791: getting the remaining hosts for this loop 7557 1726882076.94792: done getting the remaining hosts for this loop 7557 1726882076.94797: getting the next task for host managed_node3 7557 1726882076.94803: done getting next task for host managed_node3 7557 1726882076.94805: ^ task is: TASK: Install yum-utils package 7557 1726882076.94808: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7557 1726882076.94811: getting variables 7557 1726882076.94812: in VariableManager get_vars() 7557 1726882076.94838: Calling all_inventory to load vars for managed_node3 7557 1726882076.94841: Calling groups_inventory to load vars for managed_node3 7557 1726882076.94843: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882076.94852: Calling all_plugins_play to load vars for managed_node3 7557 1726882076.94855: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882076.94857: Calling groups_plugins_play to load vars for managed_node3 7557 1726882076.94971: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882076.95084: done with get_vars() 7557 1726882076.95092: done getting variables 7557 1726882076.95164: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Install yum-utils package] *********************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:26 Friday 20 September 2024 21:27:56 -0400 (0:00:00.015) 0:00:02.804 ****** 7557 1726882076.95184: entering _queue_task() for managed_node3/package 7557 1726882076.95185: Creating lock for package 7557 1726882076.95426: worker is 1 (out of 1 available) 7557 1726882076.95438: exiting _queue_task() for managed_node3/package 7557 1726882076.95449: done queuing things up, now waiting for results queue to drain 7557 1726882076.95451: waiting for pending results... 
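The skip recorded above for "Create EPEL 10" is driven by the two conditionals the executor evaluates before the command task at enable_epel.yml:8: ansible_distribution in ['RedHat', 'CentOS'] (True here) and ansible_distribution_major_version in ['7', '8'] (False on this host). Purely as a hypothetical sketch, not the actual contents of enable_epel.yml, a task of that shape could look like the following; the dnf command and EPEL release URL are illustrative assumptions only.

# Hypothetical sketch of a conditional EPEL-setup task that would produce the
# skip traced above. The command and URL are assumptions for illustration,
# not the real enable_epel.yml contents.
- name: Create EPEL {{ ansible_distribution_major_version }}
  command: >-
    dnf install -y
    https://dl.fedoraproject.org/pub/epel/epel-release-latest-{{ ansible_distribution_major_version }}.noarch.rpm
  when:
    - ansible_distribution in ['RedHat', 'CentOS']
    - ansible_distribution_major_version in ['7', '8']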
7557 1726882076.95913: running TaskExecutor() for managed_node3/TASK: Install yum-utils package 7557 1726882076.95918: in run() - task 12673a56-9f93-ed48-b3a5-000000000187 7557 1726882076.95921: variable 'ansible_search_path' from source: unknown 7557 1726882076.95924: variable 'ansible_search_path' from source: unknown 7557 1726882076.95927: calling self._execute() 7557 1726882076.96046: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882076.96061: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882076.96074: variable 'omit' from source: magic vars 7557 1726882076.96499: variable 'ansible_distribution' from source: facts 7557 1726882076.96511: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 7557 1726882076.96606: variable 'ansible_distribution_major_version' from source: facts 7557 1726882076.96610: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 7557 1726882076.96613: when evaluation is False, skipping this task 7557 1726882076.96616: _execute() done 7557 1726882076.96618: dumping result to json 7557 1726882076.96621: done dumping result, returning 7557 1726882076.96625: done running TaskExecutor() for managed_node3/TASK: Install yum-utils package [12673a56-9f93-ed48-b3a5-000000000187] 7557 1726882076.96631: sending task result for task 12673a56-9f93-ed48-b3a5-000000000187 skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 7557 1726882076.96829: no more pending results, returning what we have 7557 1726882076.96832: results queue empty 7557 1726882076.96833: checking for any_errors_fatal 7557 1726882076.96836: done checking for any_errors_fatal 7557 1726882076.96837: checking for max_fail_percentage 7557 1726882076.96838: done checking for max_fail_percentage 7557 1726882076.96839: checking to see if all hosts have failed and the running result is not ok 7557 1726882076.96840: done checking to see if all hosts have failed 7557 1726882076.96840: getting the remaining hosts for this loop 7557 1726882076.96841: done getting the remaining hosts for this loop 7557 1726882076.96844: getting the next task for host managed_node3 7557 1726882076.96849: done getting next task for host managed_node3 7557 1726882076.96850: ^ task is: TASK: Enable EPEL 7 7557 1726882076.96854: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7557 1726882076.96856: getting variables 7557 1726882076.96858: in VariableManager get_vars() 7557 1726882076.96877: Calling all_inventory to load vars for managed_node3 7557 1726882076.96879: Calling groups_inventory to load vars for managed_node3 7557 1726882076.96881: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882076.96887: done sending task result for task 12673a56-9f93-ed48-b3a5-000000000187 7557 1726882076.96890: WORKER PROCESS EXITING 7557 1726882076.96899: Calling all_plugins_play to load vars for managed_node3 7557 1726882076.96901: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882076.96904: Calling groups_plugins_play to load vars for managed_node3 7557 1726882076.97005: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882076.97125: done with get_vars() 7557 1726882076.97132: done getting variables 7557 1726882076.97173: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Enable EPEL 7] *********************************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:32 Friday 20 September 2024 21:27:56 -0400 (0:00:00.020) 0:00:02.824 ****** 7557 1726882076.97195: entering _queue_task() for managed_node3/command 7557 1726882076.97382: worker is 1 (out of 1 available) 7557 1726882076.97397: exiting _queue_task() for managed_node3/command 7557 1726882076.97408: done queuing things up, now waiting for results queue to drain 7557 1726882076.97410: waiting for pending results... 
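Similarly, the "Install yum-utils package" task at enable_epel.yml:26 loads the 'package' action plugin and is gated on the same pair of conditionals, so it is skipped on this host as well. A minimal hypothetical sketch of a task matching that trace (the package name and options are assumed for illustration):

# Hypothetical sketch only; the real enable_epel.yml:26 may differ.
- name: Install yum-utils package
  package:
    name: yum-utils
    state: present
  when:
    - ansible_distribution in ['RedHat', 'CentOS']
    - ansible_distribution_major_version in ['7', '8']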
7557 1726882076.97544: running TaskExecutor() for managed_node3/TASK: Enable EPEL 7 7557 1726882076.97620: in run() - task 12673a56-9f93-ed48-b3a5-000000000188 7557 1726882076.97630: variable 'ansible_search_path' from source: unknown 7557 1726882076.97634: variable 'ansible_search_path' from source: unknown 7557 1726882076.97665: calling self._execute() 7557 1726882076.97724: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882076.97730: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882076.97737: variable 'omit' from source: magic vars 7557 1726882076.98002: variable 'ansible_distribution' from source: facts 7557 1726882076.98012: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 7557 1726882076.98100: variable 'ansible_distribution_major_version' from source: facts 7557 1726882076.98106: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 7557 1726882076.98109: when evaluation is False, skipping this task 7557 1726882076.98112: _execute() done 7557 1726882076.98114: dumping result to json 7557 1726882076.98117: done dumping result, returning 7557 1726882076.98123: done running TaskExecutor() for managed_node3/TASK: Enable EPEL 7 [12673a56-9f93-ed48-b3a5-000000000188] 7557 1726882076.98128: sending task result for task 12673a56-9f93-ed48-b3a5-000000000188 7557 1726882076.98210: done sending task result for task 12673a56-9f93-ed48-b3a5-000000000188 7557 1726882076.98213: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 7557 1726882076.98260: no more pending results, returning what we have 7557 1726882076.98263: results queue empty 7557 1726882076.98264: checking for any_errors_fatal 7557 1726882076.98269: done checking for any_errors_fatal 7557 1726882076.98270: checking for max_fail_percentage 7557 1726882076.98271: done checking for max_fail_percentage 7557 1726882076.98271: checking to see if all hosts have failed and the running result is not ok 7557 1726882076.98272: done checking to see if all hosts have failed 7557 1726882076.98273: getting the remaining hosts for this loop 7557 1726882076.98274: done getting the remaining hosts for this loop 7557 1726882076.98277: getting the next task for host managed_node3 7557 1726882076.98283: done getting next task for host managed_node3 7557 1726882076.98285: ^ task is: TASK: Enable EPEL 8 7557 1726882076.98289: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7557 1726882076.98296: getting variables 7557 1726882076.98297: in VariableManager get_vars() 7557 1726882076.98327: Calling all_inventory to load vars for managed_node3 7557 1726882076.98330: Calling groups_inventory to load vars for managed_node3 7557 1726882076.98333: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882076.98341: Calling all_plugins_play to load vars for managed_node3 7557 1726882076.98344: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882076.98346: Calling groups_plugins_play to load vars for managed_node3 7557 1726882076.98504: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882076.98848: done with get_vars() 7557 1726882076.98857: done getting variables 7557 1726882076.98913: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Enable EPEL 8] *********************************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:37 Friday 20 September 2024 21:27:56 -0400 (0:00:00.017) 0:00:02.842 ****** 7557 1726882076.98941: entering _queue_task() for managed_node3/command 7557 1726882076.99395: worker is 1 (out of 1 available) 7557 1726882076.99402: exiting _queue_task() for managed_node3/command 7557 1726882076.99411: done queuing things up, now waiting for results queue to drain 7557 1726882076.99413: waiting for pending results... 7557 1726882076.99511: running TaskExecutor() for managed_node3/TASK: Enable EPEL 8 7557 1726882076.99517: in run() - task 12673a56-9f93-ed48-b3a5-000000000189 7557 1726882076.99600: variable 'ansible_search_path' from source: unknown 7557 1726882076.99604: variable 'ansible_search_path' from source: unknown 7557 1726882076.99606: calling self._execute() 7557 1726882076.99656: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882076.99668: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882076.99682: variable 'omit' from source: magic vars 7557 1726882077.00043: variable 'ansible_distribution' from source: facts 7557 1726882077.00060: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 7557 1726882077.00197: variable 'ansible_distribution_major_version' from source: facts 7557 1726882077.00209: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 7557 1726882077.00217: when evaluation is False, skipping this task 7557 1726882077.00224: _execute() done 7557 1726882077.00231: dumping result to json 7557 1726882077.00239: done dumping result, returning 7557 1726882077.00249: done running TaskExecutor() for managed_node3/TASK: Enable EPEL 8 [12673a56-9f93-ed48-b3a5-000000000189] 7557 1726882077.00289: sending task result for task 12673a56-9f93-ed48-b3a5-000000000189 skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 7557 1726882077.00439: no more pending results, returning what we have 7557 1726882077.00443: results queue empty 7557 1726882077.00444: checking for any_errors_fatal 7557 1726882077.00451: done checking for any_errors_fatal 7557 
1726882077.00452: checking for max_fail_percentage 7557 1726882077.00454: done checking for max_fail_percentage 7557 1726882077.00454: checking to see if all hosts have failed and the running result is not ok 7557 1726882077.00455: done checking to see if all hosts have failed 7557 1726882077.00456: getting the remaining hosts for this loop 7557 1726882077.00457: done getting the remaining hosts for this loop 7557 1726882077.00460: getting the next task for host managed_node3 7557 1726882077.00471: done getting next task for host managed_node3 7557 1726882077.00473: ^ task is: TASK: Enable EPEL 6 7557 1726882077.00477: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7557 1726882077.00481: getting variables 7557 1726882077.00483: in VariableManager get_vars() 7557 1726882077.00518: Calling all_inventory to load vars for managed_node3 7557 1726882077.00521: Calling groups_inventory to load vars for managed_node3 7557 1726882077.00525: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882077.00539: Calling all_plugins_play to load vars for managed_node3 7557 1726882077.00542: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882077.00545: Calling groups_plugins_play to load vars for managed_node3 7557 1726882077.00881: done sending task result for task 12673a56-9f93-ed48-b3a5-000000000189 7557 1726882077.00885: WORKER PROCESS EXITING 7557 1726882077.00909: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882077.01106: done with get_vars() 7557 1726882077.01116: done getting variables 7557 1726882077.01176: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Enable EPEL 6] *********************************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:42 Friday 20 September 2024 21:27:57 -0400 (0:00:00.022) 0:00:02.865 ****** 7557 1726882077.01206: entering _queue_task() for managed_node3/copy 7557 1726882077.01423: worker is 1 (out of 1 available) 7557 1726882077.01437: exiting _queue_task() for managed_node3/copy 7557 1726882077.01449: done queuing things up, now waiting for results queue to drain 7557 1726882077.01451: waiting for pending results... 
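The "Enable EPEL 6" task at enable_epel.yml:42 queued above uses the 'copy' action plugin; the next records show it being skipped because ansible_distribution_major_version == '6' evaluates to False on this CentOS Stream 10 host. As a purely hypothetical sketch (the destination path and repo definition are invented for illustration, not the actual file contents), a copy task gated that way could look like:

# Hypothetical illustration of a copy task gated on the '== 6' conditional
# seen in the skip result below; dest and content are invented assumptions.
- name: Enable EPEL 6
  copy:
    dest: /etc/yum.repos.d/epel.repo   # assumed path, for illustration only
    content: |
      [epel]
      name=Extra Packages for Enterprise Linux 6
      enabled=1
  when:
    - ansible_distribution in ['RedHat', 'CentOS']
    - ansible_distribution_major_version == '6'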
7557 1726882077.01619: running TaskExecutor() for managed_node3/TASK: Enable EPEL 6 7557 1726882077.01687: in run() - task 12673a56-9f93-ed48-b3a5-00000000018b 7557 1726882077.01699: variable 'ansible_search_path' from source: unknown 7557 1726882077.01703: variable 'ansible_search_path' from source: unknown 7557 1726882077.01730: calling self._execute() 7557 1726882077.01779: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882077.01798: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882077.01801: variable 'omit' from source: magic vars 7557 1726882077.02050: variable 'ansible_distribution' from source: facts 7557 1726882077.02059: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 7557 1726882077.02138: variable 'ansible_distribution_major_version' from source: facts 7557 1726882077.02142: Evaluated conditional (ansible_distribution_major_version == '6'): False 7557 1726882077.02145: when evaluation is False, skipping this task 7557 1726882077.02148: _execute() done 7557 1726882077.02150: dumping result to json 7557 1726882077.02153: done dumping result, returning 7557 1726882077.02230: done running TaskExecutor() for managed_node3/TASK: Enable EPEL 6 [12673a56-9f93-ed48-b3a5-00000000018b] 7557 1726882077.02233: sending task result for task 12673a56-9f93-ed48-b3a5-00000000018b 7557 1726882077.02301: done sending task result for task 12673a56-9f93-ed48-b3a5-00000000018b 7557 1726882077.02303: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version == '6'", "skip_reason": "Conditional result was False" } 7557 1726882077.02337: no more pending results, returning what we have 7557 1726882077.02340: results queue empty 7557 1726882077.02340: checking for any_errors_fatal 7557 1726882077.02344: done checking for any_errors_fatal 7557 1726882077.02345: checking for max_fail_percentage 7557 1726882077.02346: done checking for max_fail_percentage 7557 1726882077.02347: checking to see if all hosts have failed and the running result is not ok 7557 1726882077.02348: done checking to see if all hosts have failed 7557 1726882077.02348: getting the remaining hosts for this loop 7557 1726882077.02349: done getting the remaining hosts for this loop 7557 1726882077.02352: getting the next task for host managed_node3 7557 1726882077.02357: done getting next task for host managed_node3 7557 1726882077.02359: ^ task is: TASK: Set network provider to 'nm' 7557 1726882077.02361: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7557 1726882077.02363: getting variables 7557 1726882077.02364: in VariableManager get_vars() 7557 1726882077.02381: Calling all_inventory to load vars for managed_node3 7557 1726882077.02383: Calling groups_inventory to load vars for managed_node3 7557 1726882077.02385: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882077.02395: Calling all_plugins_play to load vars for managed_node3 7557 1726882077.02397: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882077.02399: Calling groups_plugins_play to load vars for managed_node3 7557 1726882077.02532: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882077.02643: done with get_vars() 7557 1726882077.02649: done getting variables 7557 1726882077.02692: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set network provider to 'nm'] ******************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tests_auto_gateway_nm.yml:13 Friday 20 September 2024 21:27:57 -0400 (0:00:00.015) 0:00:02.880 ****** 7557 1726882077.02711: entering _queue_task() for managed_node3/set_fact 7557 1726882077.02883: worker is 1 (out of 1 available) 7557 1726882077.02899: exiting _queue_task() for managed_node3/set_fact 7557 1726882077.02909: done queuing things up, now waiting for results queue to drain 7557 1726882077.02911: waiting for pending results... 7557 1726882077.03136: running TaskExecutor() for managed_node3/TASK: Set network provider to 'nm' 7557 1726882077.03143: in run() - task 12673a56-9f93-ed48-b3a5-000000000007 7557 1726882077.03147: variable 'ansible_search_path' from source: unknown 7557 1726882077.03332: calling self._execute() 7557 1726882077.03336: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882077.03338: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882077.03340: variable 'omit' from source: magic vars 7557 1726882077.03598: variable 'omit' from source: magic vars 7557 1726882077.03601: variable 'omit' from source: magic vars 7557 1726882077.03603: variable 'omit' from source: magic vars 7557 1726882077.03605: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7557 1726882077.03608: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7557 1726882077.03609: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7557 1726882077.03611: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7557 1726882077.03613: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7557 1726882077.03615: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7557 1726882077.03633: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882077.03640: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882077.03758: Set connection var ansible_module_compression to ZIP_DEFLATED 7557 
1726882077.03771: Set connection var ansible_shell_executable to /bin/sh 7557 1726882077.03777: Set connection var ansible_shell_type to sh 7557 1726882077.03786: Set connection var ansible_pipelining to False 7557 1726882077.03798: Set connection var ansible_connection to ssh 7557 1726882077.03808: Set connection var ansible_timeout to 10 7557 1726882077.03846: variable 'ansible_shell_executable' from source: unknown 7557 1726882077.03869: variable 'ansible_connection' from source: unknown 7557 1726882077.03883: variable 'ansible_module_compression' from source: unknown 7557 1726882077.03956: variable 'ansible_shell_type' from source: unknown 7557 1726882077.03959: variable 'ansible_shell_executable' from source: unknown 7557 1726882077.03962: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882077.03964: variable 'ansible_pipelining' from source: unknown 7557 1726882077.03966: variable 'ansible_timeout' from source: unknown 7557 1726882077.03968: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882077.04127: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 7557 1726882077.04136: variable 'omit' from source: magic vars 7557 1726882077.04139: starting attempt loop 7557 1726882077.04144: running the handler 7557 1726882077.04153: handler run complete 7557 1726882077.04162: attempt loop complete, returning result 7557 1726882077.04171: _execute() done 7557 1726882077.04174: dumping result to json 7557 1726882077.04177: done dumping result, returning 7557 1726882077.04184: done running TaskExecutor() for managed_node3/TASK: Set network provider to 'nm' [12673a56-9f93-ed48-b3a5-000000000007] 7557 1726882077.04188: sending task result for task 12673a56-9f93-ed48-b3a5-000000000007 7557 1726882077.04270: done sending task result for task 12673a56-9f93-ed48-b3a5-000000000007 7557 1726882077.04275: WORKER PROCESS EXITING ok: [managed_node3] => { "ansible_facts": { "network_provider": "nm" }, "changed": false } 7557 1726882077.04327: no more pending results, returning what we have 7557 1726882077.04330: results queue empty 7557 1726882077.04331: checking for any_errors_fatal 7557 1726882077.04337: done checking for any_errors_fatal 7557 1726882077.04338: checking for max_fail_percentage 7557 1726882077.04339: done checking for max_fail_percentage 7557 1726882077.04340: checking to see if all hosts have failed and the running result is not ok 7557 1726882077.04341: done checking to see if all hosts have failed 7557 1726882077.04341: getting the remaining hosts for this loop 7557 1726882077.04343: done getting the remaining hosts for this loop 7557 1726882077.04346: getting the next task for host managed_node3 7557 1726882077.04351: done getting next task for host managed_node3 7557 1726882077.04353: ^ task is: TASK: meta (flush_handlers) 7557 1726882077.04355: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7557 1726882077.04358: getting variables 7557 1726882077.04359: in VariableManager get_vars() 7557 1726882077.04383: Calling all_inventory to load vars for managed_node3 7557 1726882077.04387: Calling groups_inventory to load vars for managed_node3 7557 1726882077.04390: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882077.04399: Calling all_plugins_play to load vars for managed_node3 7557 1726882077.04402: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882077.04404: Calling groups_plugins_play to load vars for managed_node3 7557 1726882077.04524: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882077.04632: done with get_vars() 7557 1726882077.04639: done getting variables 7557 1726882077.04682: in VariableManager get_vars() 7557 1726882077.04688: Calling all_inventory to load vars for managed_node3 7557 1726882077.04689: Calling groups_inventory to load vars for managed_node3 7557 1726882077.04691: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882077.04696: Calling all_plugins_play to load vars for managed_node3 7557 1726882077.04698: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882077.04699: Calling groups_plugins_play to load vars for managed_node3 7557 1726882077.04779: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882077.04906: done with get_vars() 7557 1726882077.04915: done queuing things up, now waiting for results queue to drain 7557 1726882077.04916: results queue empty 7557 1726882077.04917: checking for any_errors_fatal 7557 1726882077.04918: done checking for any_errors_fatal 7557 1726882077.04919: checking for max_fail_percentage 7557 1726882077.04919: done checking for max_fail_percentage 7557 1726882077.04920: checking to see if all hosts have failed and the running result is not ok 7557 1726882077.04920: done checking to see if all hosts have failed 7557 1726882077.04921: getting the remaining hosts for this loop 7557 1726882077.04921: done getting the remaining hosts for this loop 7557 1726882077.04923: getting the next task for host managed_node3 7557 1726882077.04925: done getting next task for host managed_node3 7557 1726882077.04926: ^ task is: TASK: meta (flush_handlers) 7557 1726882077.04927: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7557 1726882077.04934: getting variables 7557 1726882077.04935: in VariableManager get_vars() 7557 1726882077.04940: Calling all_inventory to load vars for managed_node3 7557 1726882077.04941: Calling groups_inventory to load vars for managed_node3 7557 1726882077.04943: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882077.04945: Calling all_plugins_play to load vars for managed_node3 7557 1726882077.04947: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882077.04948: Calling groups_plugins_play to load vars for managed_node3 7557 1726882077.05027: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882077.05131: done with get_vars() 7557 1726882077.05137: done getting variables 7557 1726882077.05167: in VariableManager get_vars() 7557 1726882077.05172: Calling all_inventory to load vars for managed_node3 7557 1726882077.05173: Calling groups_inventory to load vars for managed_node3 7557 1726882077.05175: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882077.05177: Calling all_plugins_play to load vars for managed_node3 7557 1726882077.05179: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882077.05180: Calling groups_plugins_play to load vars for managed_node3 7557 1726882077.05263: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882077.05381: done with get_vars() 7557 1726882077.05388: done queuing things up, now waiting for results queue to drain 7557 1726882077.05390: results queue empty 7557 1726882077.05390: checking for any_errors_fatal 7557 1726882077.05391: done checking for any_errors_fatal 7557 1726882077.05392: checking for max_fail_percentage 7557 1726882077.05394: done checking for max_fail_percentage 7557 1726882077.05394: checking to see if all hosts have failed and the running result is not ok 7557 1726882077.05395: done checking to see if all hosts have failed 7557 1726882077.05395: getting the remaining hosts for this loop 7557 1726882077.05396: done getting the remaining hosts for this loop 7557 1726882077.05397: getting the next task for host managed_node3 7557 1726882077.05399: done getting next task for host managed_node3 7557 1726882077.05399: ^ task is: None 7557 1726882077.05400: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7557 1726882077.05401: done queuing things up, now waiting for results queue to drain 7557 1726882077.05402: results queue empty 7557 1726882077.05402: checking for any_errors_fatal 7557 1726882077.05402: done checking for any_errors_fatal 7557 1726882077.05403: checking for max_fail_percentage 7557 1726882077.05403: done checking for max_fail_percentage 7557 1726882077.05404: checking to see if all hosts have failed and the running result is not ok 7557 1726882077.05404: done checking to see if all hosts have failed 7557 1726882077.05405: getting the next task for host managed_node3 7557 1726882077.05406: done getting next task for host managed_node3 7557 1726882077.05407: ^ task is: None 7557 1726882077.05408: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7557 1726882077.05445: in VariableManager get_vars() 7557 1726882077.05467: done with get_vars() 7557 1726882077.05471: in VariableManager get_vars() 7557 1726882077.05484: done with get_vars() 7557 1726882077.05487: variable 'omit' from source: magic vars 7557 1726882077.05510: in VariableManager get_vars() 7557 1726882077.05523: done with get_vars() 7557 1726882077.05539: variable 'omit' from source: magic vars PLAY [Play for testing auto_gateway setting] *********************************** 7557 1726882077.05850: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 7557 1726882077.05871: getting the remaining hosts for this loop 7557 1726882077.05872: done getting the remaining hosts for this loop 7557 1726882077.05874: getting the next task for host managed_node3 7557 1726882077.05875: done getting next task for host managed_node3 7557 1726882077.05876: ^ task is: TASK: Gathering Facts 7557 1726882077.05877: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7557 1726882077.05878: getting variables 7557 1726882077.05879: in VariableManager get_vars() 7557 1726882077.05890: Calling all_inventory to load vars for managed_node3 7557 1726882077.05891: Calling groups_inventory to load vars for managed_node3 7557 1726882077.05894: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882077.05898: Calling all_plugins_play to load vars for managed_node3 7557 1726882077.05907: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882077.05910: Calling groups_plugins_play to load vars for managed_node3 7557 1726882077.06008: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882077.06184: done with get_vars() 7557 1726882077.06192: done getting variables 7557 1726882077.06228: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_auto_gateway.yml:3 Friday 20 September 2024 21:27:57 -0400 (0:00:00.035) 0:00:02.915 ****** 7557 1726882077.06248: entering _queue_task() for managed_node3/gather_facts 7557 1726882077.06462: worker is 1 (out of 1 available) 7557 1726882077.06475: exiting _queue_task() for managed_node3/gather_facts 7557 1726882077.06487: done queuing things up, now waiting for results queue to drain 7557 1726882077.06489: waiting for pending results... 7557 1726882077.06907: running TaskExecutor() for managed_node3/TASK: Gathering Facts 7557 1726882077.06912: in run() - task 12673a56-9f93-ed48-b3a5-0000000001b1 7557 1726882077.06914: variable 'ansible_search_path' from source: unknown 7557 1726882077.06917: calling self._execute() 7557 1726882077.06942: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882077.06953: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882077.06964: variable 'omit' from source: magic vars 7557 1726882077.07300: variable 'ansible_distribution_major_version' from source: facts 7557 1726882077.07318: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882077.07331: variable 'omit' from source: magic vars 7557 1726882077.07359: variable 'omit' from source: magic vars 7557 1726882077.07399: variable 'omit' from source: magic vars 7557 1726882077.07442: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7557 1726882077.07480: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7557 1726882077.07580: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7557 1726882077.07609: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7557 1726882077.07620: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7557 1726882077.07644: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7557 1726882077.07652: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 
1726882077.07655: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882077.07725: Set connection var ansible_module_compression to ZIP_DEFLATED 7557 1726882077.07732: Set connection var ansible_shell_executable to /bin/sh 7557 1726882077.07735: Set connection var ansible_shell_type to sh 7557 1726882077.07738: Set connection var ansible_pipelining to False 7557 1726882077.07740: Set connection var ansible_connection to ssh 7557 1726882077.07755: Set connection var ansible_timeout to 10 7557 1726882077.07767: variable 'ansible_shell_executable' from source: unknown 7557 1726882077.07770: variable 'ansible_connection' from source: unknown 7557 1726882077.07773: variable 'ansible_module_compression' from source: unknown 7557 1726882077.07775: variable 'ansible_shell_type' from source: unknown 7557 1726882077.07778: variable 'ansible_shell_executable' from source: unknown 7557 1726882077.07780: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882077.07783: variable 'ansible_pipelining' from source: unknown 7557 1726882077.07785: variable 'ansible_timeout' from source: unknown 7557 1726882077.07792: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882077.07917: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 7557 1726882077.07925: variable 'omit' from source: magic vars 7557 1726882077.07928: starting attempt loop 7557 1726882077.07931: running the handler 7557 1726882077.07944: variable 'ansible_facts' from source: unknown 7557 1726882077.07958: _low_level_execute_command(): starting 7557 1726882077.07965: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7557 1726882077.08450: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882077.08454: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882077.08499: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882077.08515: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882077.08570: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882077.10569: stdout chunk (state=3): >>>/root <<< 7557 1726882077.10668: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882077.10695: stderr 
chunk (state=3): >>><<< 7557 1726882077.10699: stdout chunk (state=3): >>><<< 7557 1726882077.10716: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7557 1726882077.10728: _low_level_execute_command(): starting 7557 1726882077.10733: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882077.107164-7757-198524217580669 `" && echo ansible-tmp-1726882077.107164-7757-198524217580669="` echo /root/.ansible/tmp/ansible-tmp-1726882077.107164-7757-198524217580669 `" ) && sleep 0' 7557 1726882077.11147: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882077.11151: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882077.11153: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration <<< 7557 1726882077.11162: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882077.11214: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882077.11219: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882077.11267: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882077.13123: stdout chunk (state=3): >>>ansible-tmp-1726882077.107164-7757-198524217580669=/root/.ansible/tmp/ansible-tmp-1726882077.107164-7757-198524217580669 <<< 7557 1726882077.13225: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 
1726882077.13254: stderr chunk (state=3): >>><<< 7557 1726882077.13257: stdout chunk (state=3): >>><<< 7557 1726882077.13266: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882077.107164-7757-198524217580669=/root/.ansible/tmp/ansible-tmp-1726882077.107164-7757-198524217580669 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7557 1726882077.13288: variable 'ansible_module_compression' from source: unknown 7557 1726882077.13328: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-7557ap94rh2e/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 7557 1726882077.13374: variable 'ansible_facts' from source: unknown 7557 1726882077.13505: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882077.107164-7757-198524217580669/AnsiballZ_setup.py 7557 1726882077.13604: Sending initial data 7557 1726882077.13607: Sent initial data (151 bytes) 7557 1726882077.14029: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882077.14032: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882077.14035: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 <<< 7557 1726882077.14037: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882077.14083: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882077.14086: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882077.14139: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882077.16158: stderr 
chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 7557 1726882077.16163: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 7557 1726882077.16211: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 7557 1726882077.16263: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-7557ap94rh2e/tmpsawfj0s9 /root/.ansible/tmp/ansible-tmp-1726882077.107164-7757-198524217580669/AnsiballZ_setup.py <<< 7557 1726882077.16266: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882077.107164-7757-198524217580669/AnsiballZ_setup.py" <<< 7557 1726882077.16326: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-7557ap94rh2e/tmpsawfj0s9" to remote "/root/.ansible/tmp/ansible-tmp-1726882077.107164-7757-198524217580669/AnsiballZ_setup.py" <<< 7557 1726882077.16329: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882077.107164-7757-198524217580669/AnsiballZ_setup.py" <<< 7557 1726882077.17419: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882077.17457: stderr chunk (state=3): >>><<< 7557 1726882077.17461: stdout chunk (state=3): >>><<< 7557 1726882077.17477: done transferring module to remote 7557 1726882077.17494: _low_level_execute_command(): starting 7557 1726882077.17498: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882077.107164-7757-198524217580669/ /root/.ansible/tmp/ansible-tmp-1726882077.107164-7757-198524217580669/AnsiballZ_setup.py && sleep 0' 7557 1726882077.17929: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7557 1726882077.17933: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7557 1726882077.17936: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882077.17938: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882077.17940: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 
7557 1726882077.17987: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882077.17990: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882077.18046: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882077.20429: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882077.20451: stderr chunk (state=3): >>><<< 7557 1726882077.20454: stdout chunk (state=3): >>><<< 7557 1726882077.20466: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7557 1726882077.20469: _low_level_execute_command(): starting 7557 1726882077.20474: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882077.107164-7757-198524217580669/AnsiballZ_setup.py && sleep 0' 7557 1726882077.20903: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7557 1726882077.20906: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882077.20909: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration <<< 7557 1726882077.20911: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7557 1726882077.20913: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882077.20959: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882077.20967: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7557 
1726882077.21019: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882077.96968: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-10-229.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-10-229", "ansible_nodename": "ip-10-31-10-229.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec23ea4468ccc875d6f6db60ff64318a", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCv7uM8iExTeI4wsxGEirDCIB5rfuashDyqixAMrsgojV44m9e49NO3hj7ILsTTBL2CHnfLuLE1/PLpq7UY8Z1Z8ro+SmmXu++VXRqryH5co2uqHva7V6sHb6D0w7V9QhBLpdZFYEoP0DS5gVD9JQFynOilgl8wt/jWccIG1lWZi9pozQdP7A/myzjixT/sJ/dwyz8xvTWJg8mm1MsbYn2WTH8iil55RGt5+Srq66y14fY2WfYG2fpZAu2FUQP08MxFIAzAetJatr6cWpPKpSpFt3GxBUw9mZMYCqrmgqwBD/PAtXD6Q7x/7qAtiiHsfMBTZienaA1mW1aNHB5lYinW+yIEPJsEXOfVQXD7Grje437Hq7ilY2Ls8shFo/H1kZ7MVesrrJ0x/2SBU9GvKJMaweWKcsmmll+jNBUuGX6ts04Vmsca92EMTJvbEZ5S0c4wSIE0d0Abf1Xqh6e9aP6EWDz6EY13coJ8t20q68K2L8C+7SV2ymAL1nKR36KDmUU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBK8+EpkEsEK0/7/tF+Ot2JevPtJYRlnBvekg0Ue9FRv3lrN7bw8W95KfTN9YYbHxSXwfmPM7CC79pp6v7bDk8dE=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIFW1A+ae3pfP8rgVu0EA2QvBQu2xPGiaOdV7VpH2SdJ3", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_local": {}, "ansible_loadavg": {"1m": 0.0068359375, "5m": 0.1484375, 
"15m": 0.0927734375}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "27", "second": "57", "epoch": "1726882077", "epoch_int": "1726882077", "date": "2024-09-20", "time": "21:27:57", "iso8601_micro": "2024-09-21T01:27:57.580773Z", "iso8601": "2024-09-21T01:27:57Z", "iso8601_basic": "20240920T212757580773", "iso8601_basic_short": "20240920T212757", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_interfaces": ["eth0", "lo"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "12:87:27:91:87:37", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.10.229", "broadcast": "10.31.11.255", "netmask": 
"255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::1087:27ff:fe91:8737", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "t<<< 7557 1726882077.96983: stdout chunk (state=3): >>>x_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.10.229", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:87:27:91:87:37", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["<<< 7557 1726882077.96995: stdout chunk (state=3): >>>10.31.10.229"], "ansible_all_ipv6_addresses": ["fe80::1087:27ff:fe91:8737"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.10.229", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::1087:27ff:fe91:8737"]}, "ansible_is_chroot": false, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 3026, "ansible_swaptotal_mb": 0, 
"ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 505, "free": 3026}, "nocache": {"free": 3300, "used": 231}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec23ea44-68cc-c875-d6f6-db60ff64318a", "ansible_product_uuid": "ec23ea44-68cc-c875-d6f6-db60ff64318a", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 384, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mo<<< 7557 1726882077.97070: stdout chunk (state=3): >>>unt": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261814886400, "block_size": 4096, "block_total": 65519099, "block_available": 63919650, "block_used": 1599449, "inode_total": 131070960, "inode_available": 131029179, "inode_used": 41781, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.11.248 53716 10.31.10.229 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.11.248 53716 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_iscsi_iqn": "", "ansible_lsb": {}, "ansible_fips": false, "ansible_apparmor": {"status": "disabled"}, 
"ansible_fibre_channel_wwn": [], "ansible_service_mgr": "systemd", "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 7557 1726882077.99219: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. <<< 7557 1726882077.99251: stderr chunk (state=3): >>><<< 7557 1726882077.99255: stdout chunk (state=3): >>><<< 7557 1726882077.99287: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-10-229.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-10-229", "ansible_nodename": "ip-10-31-10-229.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec23ea4468ccc875d6f6db60ff64318a", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCv7uM8iExTeI4wsxGEirDCIB5rfuashDyqixAMrsgojV44m9e49NO3hj7ILsTTBL2CHnfLuLE1/PLpq7UY8Z1Z8ro+SmmXu++VXRqryH5co2uqHva7V6sHb6D0w7V9QhBLpdZFYEoP0DS5gVD9JQFynOilgl8wt/jWccIG1lWZi9pozQdP7A/myzjixT/sJ/dwyz8xvTWJg8mm1MsbYn2WTH8iil55RGt5+Srq66y14fY2WfYG2fpZAu2FUQP08MxFIAzAetJatr6cWpPKpSpFt3GxBUw9mZMYCqrmgqwBD/PAtXD6Q7x/7qAtiiHsfMBTZienaA1mW1aNHB5lYinW+yIEPJsEXOfVQXD7Grje437Hq7ilY2Ls8shFo/H1kZ7MVesrrJ0x/2SBU9GvKJMaweWKcsmmll+jNBUuGX6ts04Vmsca92EMTJvbEZ5S0c4wSIE0d0Abf1Xqh6e9aP6EWDz6EY13coJ8t20q68K2L8C+7SV2ymAL1nKR36KDmUU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBK8+EpkEsEK0/7/tF+Ot2JevPtJYRlnBvekg0Ue9FRv3lrN7bw8W95KfTN9YYbHxSXwfmPM7CC79pp6v7bDk8dE=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", 
"ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIFW1A+ae3pfP8rgVu0EA2QvBQu2xPGiaOdV7VpH2SdJ3", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_local": {}, "ansible_loadavg": {"1m": 0.0068359375, "5m": 0.1484375, "15m": 0.0927734375}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "27", "second": "57", "epoch": "1726882077", "epoch_int": "1726882077", "date": "2024-09-20", "time": "21:27:57", "iso8601_micro": "2024-09-21T01:27:57.580773Z", "iso8601": "2024-09-21T01:27:57Z", "iso8601_basic": "20240920T212757580773", "iso8601_basic_short": "20240920T212757", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_interfaces": ["eth0", "lo"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": 
"off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "12:87:27:91:87:37", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.10.229", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::1087:27ff:fe91:8737", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.10.229", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:87:27:91:87:37", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.10.229"], "ansible_all_ipv6_addresses": ["fe80::1087:27ff:fe91:8737"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.10.229", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::1087:27ff:fe91:8737"]}, "ansible_is_chroot": false, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU 
E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 3026, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 505, "free": 3026}, "nocache": {"free": 3300, "used": 231}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec23ea44-68cc-c875-d6f6-db60ff64318a", "ansible_product_uuid": "ec23ea44-68cc-c875-d6f6-db60ff64318a", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 384, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261814886400, "block_size": 4096, "block_total": 65519099, "block_available": 63919650, "block_used": 1599449, "inode_total": 131070960, "inode_available": 131029179, "inode_used": 41781, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.11.248 53716 10.31.10.229 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.11.248 53716 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_python": {"version": {"major": 3, "minor": 12, 
"micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_iscsi_iqn": "", "ansible_lsb": {}, "ansible_fips": false, "ansible_apparmor": {"status": "disabled"}, "ansible_fibre_channel_wwn": [], "ansible_service_mgr": "systemd", "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. 
7557 1726882077.99482: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882077.107164-7757-198524217580669/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7557 1726882077.99503: _low_level_execute_command(): starting 7557 1726882077.99507: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882077.107164-7757-198524217580669/ > /dev/null 2>&1 && sleep 0' 7557 1726882077.99961: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7557 1726882077.99964: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7557 1726882077.99966: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882077.99968: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 7557 1726882077.99971: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found <<< 7557 1726882077.99973: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882078.00029: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882078.00033: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882078.00036: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882078.00082: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882078.01852: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882078.01880: stderr chunk (state=3): >>><<< 7557 1726882078.01883: stdout chunk (state=3): >>><<< 7557 1726882078.01898: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7557 1726882078.01905: handler run complete 7557 1726882078.01980: variable 'ansible_facts' from source: unknown 7557 1726882078.02045: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882078.02222: variable 'ansible_facts' from source: unknown 7557 1726882078.02284: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882078.02362: attempt loop complete, returning result 7557 1726882078.02365: _execute() done 7557 1726882078.02368: dumping result to json 7557 1726882078.02385: done dumping result, returning 7557 1726882078.02396: done running TaskExecutor() for managed_node3/TASK: Gathering Facts [12673a56-9f93-ed48-b3a5-0000000001b1] 7557 1726882078.02399: sending task result for task 12673a56-9f93-ed48-b3a5-0000000001b1 7557 1726882078.02680: done sending task result for task 12673a56-9f93-ed48-b3a5-0000000001b1 7557 1726882078.02683: WORKER PROCESS EXITING ok: [managed_node3] 7557 1726882078.02866: no more pending results, returning what we have 7557 1726882078.02868: results queue empty 7557 1726882078.02868: checking for any_errors_fatal 7557 1726882078.02869: done checking for any_errors_fatal 7557 1726882078.02869: checking for max_fail_percentage 7557 1726882078.02870: done checking for max_fail_percentage 7557 1726882078.02871: checking to see if all hosts have failed and the running result is not ok 7557 1726882078.02871: done checking to see if all hosts have failed 7557 1726882078.02872: getting the remaining hosts for this loop 7557 1726882078.02872: done getting the remaining hosts for this loop 7557 1726882078.02875: getting the next task for host managed_node3 7557 1726882078.02878: done getting next task for host managed_node3 7557 1726882078.02879: ^ task is: TASK: meta (flush_handlers) 7557 1726882078.02880: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7557 1726882078.02882: getting variables 7557 1726882078.02883: in VariableManager get_vars() 7557 1726882078.02915: Calling all_inventory to load vars for managed_node3 7557 1726882078.02917: Calling groups_inventory to load vars for managed_node3 7557 1726882078.02918: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882078.02926: Calling all_plugins_play to load vars for managed_node3 7557 1726882078.02927: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882078.02929: Calling groups_plugins_play to load vars for managed_node3 7557 1726882078.03031: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882078.03144: done with get_vars() 7557 1726882078.03151: done getting variables 7557 1726882078.03201: in VariableManager get_vars() 7557 1726882078.03212: Calling all_inventory to load vars for managed_node3 7557 1726882078.03213: Calling groups_inventory to load vars for managed_node3 7557 1726882078.03215: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882078.03218: Calling all_plugins_play to load vars for managed_node3 7557 1726882078.03220: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882078.03221: Calling groups_plugins_play to load vars for managed_node3 7557 1726882078.03305: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882078.03412: done with get_vars() 7557 1726882078.03421: done queuing things up, now waiting for results queue to drain 7557 1726882078.03422: results queue empty 7557 1726882078.03422: checking for any_errors_fatal 7557 1726882078.03424: done checking for any_errors_fatal 7557 1726882078.03428: checking for max_fail_percentage 7557 1726882078.03429: done checking for max_fail_percentage 7557 1726882078.03429: checking to see if all hosts have failed and the running result is not ok 7557 1726882078.03430: done checking to see if all hosts have failed 7557 1726882078.03430: getting the remaining hosts for this loop 7557 1726882078.03431: done getting the remaining hosts for this loop 7557 1726882078.03432: getting the next task for host managed_node3 7557 1726882078.03435: done getting next task for host managed_node3 7557 1726882078.03436: ^ task is: TASK: Include the task 'show_interfaces.yml' 7557 1726882078.03437: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7557 1726882078.03438: getting variables 7557 1726882078.03439: in VariableManager get_vars() 7557 1726882078.03450: Calling all_inventory to load vars for managed_node3 7557 1726882078.03452: Calling groups_inventory to load vars for managed_node3 7557 1726882078.03454: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882078.03458: Calling all_plugins_play to load vars for managed_node3 7557 1726882078.03459: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882078.03461: Calling groups_plugins_play to load vars for managed_node3 7557 1726882078.03556: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882078.03663: done with get_vars() 7557 1726882078.03671: done getting variables TASK [Include the task 'show_interfaces.yml'] ********************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_auto_gateway.yml:9 Friday 20 September 2024 21:27:58 -0400 (0:00:00.974) 0:00:03.890 ****** 7557 1726882078.03721: entering _queue_task() for managed_node3/include_tasks 7557 1726882078.03920: worker is 1 (out of 1 available) 7557 1726882078.03933: exiting _queue_task() for managed_node3/include_tasks 7557 1726882078.03944: done queuing things up, now waiting for results queue to drain 7557 1726882078.03945: waiting for pending results... 7557 1726882078.04092: running TaskExecutor() for managed_node3/TASK: Include the task 'show_interfaces.yml' 7557 1726882078.04149: in run() - task 12673a56-9f93-ed48-b3a5-00000000000b 7557 1726882078.04160: variable 'ansible_search_path' from source: unknown 7557 1726882078.04195: calling self._execute() 7557 1726882078.04252: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882078.04256: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882078.04265: variable 'omit' from source: magic vars 7557 1726882078.04526: variable 'ansible_distribution_major_version' from source: facts 7557 1726882078.04535: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882078.04541: _execute() done 7557 1726882078.04545: dumping result to json 7557 1726882078.04547: done dumping result, returning 7557 1726882078.04553: done running TaskExecutor() for managed_node3/TASK: Include the task 'show_interfaces.yml' [12673a56-9f93-ed48-b3a5-00000000000b] 7557 1726882078.04558: sending task result for task 12673a56-9f93-ed48-b3a5-00000000000b 7557 1726882078.04644: done sending task result for task 12673a56-9f93-ed48-b3a5-00000000000b 7557 1726882078.04646: WORKER PROCESS EXITING 7557 1726882078.04670: no more pending results, returning what we have 7557 1726882078.04675: in VariableManager get_vars() 7557 1726882078.04724: Calling all_inventory to load vars for managed_node3 7557 1726882078.04727: Calling groups_inventory to load vars for managed_node3 7557 1726882078.04729: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882078.04737: Calling all_plugins_play to load vars for managed_node3 7557 1726882078.04739: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882078.04742: Calling groups_plugins_play to load vars for managed_node3 7557 1726882078.04862: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882078.04974: done with get_vars() 7557 1726882078.04979: variable 'ansible_search_path' from source: 
unknown 7557 1726882078.04988: we have included files to process 7557 1726882078.04989: generating all_blocks data 7557 1726882078.04990: done generating all_blocks data 7557 1726882078.04991: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 7557 1726882078.04992: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 7557 1726882078.04995: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 7557 1726882078.05095: in VariableManager get_vars() 7557 1726882078.05112: done with get_vars() 7557 1726882078.05183: done processing included file 7557 1726882078.05185: iterating over new_blocks loaded from include file 7557 1726882078.05186: in VariableManager get_vars() 7557 1726882078.05202: done with get_vars() 7557 1726882078.05203: filtering new block on tags 7557 1726882078.05213: done filtering new block on tags 7557 1726882078.05215: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml for managed_node3 7557 1726882078.05219: extending task lists for all hosts with included blocks 7557 1726882078.07435: done extending task lists 7557 1726882078.07437: done processing included files 7557 1726882078.07438: results queue empty 7557 1726882078.07438: checking for any_errors_fatal 7557 1726882078.07439: done checking for any_errors_fatal 7557 1726882078.07439: checking for max_fail_percentage 7557 1726882078.07440: done checking for max_fail_percentage 7557 1726882078.07441: checking to see if all hosts have failed and the running result is not ok 7557 1726882078.07441: done checking to see if all hosts have failed 7557 1726882078.07442: getting the remaining hosts for this loop 7557 1726882078.07442: done getting the remaining hosts for this loop 7557 1726882078.07444: getting the next task for host managed_node3 7557 1726882078.07446: done getting next task for host managed_node3 7557 1726882078.07448: ^ task is: TASK: Include the task 'get_current_interfaces.yml' 7557 1726882078.07449: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7557 1726882078.07451: getting variables 7557 1726882078.07452: in VariableManager get_vars() 7557 1726882078.07465: Calling all_inventory to load vars for managed_node3 7557 1726882078.07466: Calling groups_inventory to load vars for managed_node3 7557 1726882078.07467: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882078.07471: Calling all_plugins_play to load vars for managed_node3 7557 1726882078.07473: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882078.07474: Calling groups_plugins_play to load vars for managed_node3 7557 1726882078.07705: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882078.07813: done with get_vars() 7557 1726882078.07819: done getting variables TASK [Include the task 'get_current_interfaces.yml'] *************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:3 Friday 20 September 2024 21:27:58 -0400 (0:00:00.041) 0:00:03.931 ****** 7557 1726882078.07870: entering _queue_task() for managed_node3/include_tasks 7557 1726882078.08086: worker is 1 (out of 1 available) 7557 1726882078.08098: exiting _queue_task() for managed_node3/include_tasks 7557 1726882078.08112: done queuing things up, now waiting for results queue to drain 7557 1726882078.08114: waiting for pending results... 7557 1726882078.08268: running TaskExecutor() for managed_node3/TASK: Include the task 'get_current_interfaces.yml' 7557 1726882078.08335: in run() - task 12673a56-9f93-ed48-b3a5-0000000001ca 7557 1726882078.08353: variable 'ansible_search_path' from source: unknown 7557 1726882078.08356: variable 'ansible_search_path' from source: unknown 7557 1726882078.08379: calling self._execute() 7557 1726882078.08446: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882078.08452: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882078.08467: variable 'omit' from source: magic vars 7557 1726882078.08730: variable 'ansible_distribution_major_version' from source: facts 7557 1726882078.08740: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882078.08745: _execute() done 7557 1726882078.08748: dumping result to json 7557 1726882078.08751: done dumping result, returning 7557 1726882078.08757: done running TaskExecutor() for managed_node3/TASK: Include the task 'get_current_interfaces.yml' [12673a56-9f93-ed48-b3a5-0000000001ca] 7557 1726882078.08762: sending task result for task 12673a56-9f93-ed48-b3a5-0000000001ca 7557 1726882078.08841: done sending task result for task 12673a56-9f93-ed48-b3a5-0000000001ca 7557 1726882078.08844: WORKER PROCESS EXITING 7557 1726882078.08868: no more pending results, returning what we have 7557 1726882078.08873: in VariableManager get_vars() 7557 1726882078.08925: Calling all_inventory to load vars for managed_node3 7557 1726882078.08928: Calling groups_inventory to load vars for managed_node3 7557 1726882078.08930: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882078.08941: Calling all_plugins_play to load vars for managed_node3 7557 1726882078.08944: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882078.08946: Calling groups_plugins_play to load vars for managed_node3 7557 1726882078.09083: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 
1726882078.09208: done with get_vars() 7557 1726882078.09215: variable 'ansible_search_path' from source: unknown 7557 1726882078.09218: variable 'ansible_search_path' from source: unknown 7557 1726882078.09242: we have included files to process 7557 1726882078.09243: generating all_blocks data 7557 1726882078.09244: done generating all_blocks data 7557 1726882078.09245: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 7557 1726882078.09246: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 7557 1726882078.09247: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 7557 1726882078.09480: done processing included file 7557 1726882078.09482: iterating over new_blocks loaded from include file 7557 1726882078.09483: in VariableManager get_vars() 7557 1726882078.09502: done with get_vars() 7557 1726882078.09503: filtering new block on tags 7557 1726882078.09513: done filtering new block on tags 7557 1726882078.09515: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml for managed_node3 7557 1726882078.09518: extending task lists for all hosts with included blocks 7557 1726882078.09577: done extending task lists 7557 1726882078.09578: done processing included files 7557 1726882078.09579: results queue empty 7557 1726882078.09579: checking for any_errors_fatal 7557 1726882078.09581: done checking for any_errors_fatal 7557 1726882078.09582: checking for max_fail_percentage 7557 1726882078.09582: done checking for max_fail_percentage 7557 1726882078.09583: checking to see if all hosts have failed and the running result is not ok 7557 1726882078.09583: done checking to see if all hosts have failed 7557 1726882078.09584: getting the remaining hosts for this loop 7557 1726882078.09585: done getting the remaining hosts for this loop 7557 1726882078.09586: getting the next task for host managed_node3 7557 1726882078.09589: done getting next task for host managed_node3 7557 1726882078.09592: ^ task is: TASK: Gather current interface info 7557 1726882078.09596: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7557 1726882078.09597: getting variables 7557 1726882078.09598: in VariableManager get_vars() 7557 1726882078.09608: Calling all_inventory to load vars for managed_node3 7557 1726882078.09609: Calling groups_inventory to load vars for managed_node3 7557 1726882078.09611: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882078.09614: Calling all_plugins_play to load vars for managed_node3 7557 1726882078.09615: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882078.09618: Calling groups_plugins_play to load vars for managed_node3 7557 1726882078.09700: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882078.09816: done with get_vars() 7557 1726882078.09821: done getting variables 7557 1726882078.09846: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gather current interface info] ******************************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3 Friday 20 September 2024 21:27:58 -0400 (0:00:00.019) 0:00:03.951 ****** 7557 1726882078.09866: entering _queue_task() for managed_node3/command 7557 1726882078.10073: worker is 1 (out of 1 available) 7557 1726882078.10085: exiting _queue_task() for managed_node3/command 7557 1726882078.10101: done queuing things up, now waiting for results queue to drain 7557 1726882078.10103: waiting for pending results... 7557 1726882078.10248: running TaskExecutor() for managed_node3/TASK: Gather current interface info 7557 1726882078.10315: in run() - task 12673a56-9f93-ed48-b3a5-000000000389 7557 1726882078.10326: variable 'ansible_search_path' from source: unknown 7557 1726882078.10331: variable 'ansible_search_path' from source: unknown 7557 1726882078.10358: calling self._execute() 7557 1726882078.10425: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882078.10429: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882078.10437: variable 'omit' from source: magic vars 7557 1726882078.10728: variable 'ansible_distribution_major_version' from source: facts 7557 1726882078.10737: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882078.10743: variable 'omit' from source: magic vars 7557 1726882078.10772: variable 'omit' from source: magic vars 7557 1726882078.10808: variable 'omit' from source: magic vars 7557 1726882078.10838: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7557 1726882078.10865: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7557 1726882078.10880: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7557 1726882078.10903: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7557 1726882078.10913: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7557 1726882078.10936: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7557 
1726882078.10940: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882078.10942: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882078.11016: Set connection var ansible_module_compression to ZIP_DEFLATED 7557 1726882078.11022: Set connection var ansible_shell_executable to /bin/sh 7557 1726882078.11025: Set connection var ansible_shell_type to sh 7557 1726882078.11029: Set connection var ansible_pipelining to False 7557 1726882078.11032: Set connection var ansible_connection to ssh 7557 1726882078.11037: Set connection var ansible_timeout to 10 7557 1726882078.11052: variable 'ansible_shell_executable' from source: unknown 7557 1726882078.11056: variable 'ansible_connection' from source: unknown 7557 1726882078.11058: variable 'ansible_module_compression' from source: unknown 7557 1726882078.11061: variable 'ansible_shell_type' from source: unknown 7557 1726882078.11063: variable 'ansible_shell_executable' from source: unknown 7557 1726882078.11065: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882078.11067: variable 'ansible_pipelining' from source: unknown 7557 1726882078.11070: variable 'ansible_timeout' from source: unknown 7557 1726882078.11074: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882078.11175: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 7557 1726882078.11184: variable 'omit' from source: magic vars 7557 1726882078.11189: starting attempt loop 7557 1726882078.11196: running the handler 7557 1726882078.11211: _low_level_execute_command(): starting 7557 1726882078.11225: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7557 1726882078.11731: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882078.11735: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882078.11739: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7557 1726882078.11742: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882078.11798: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882078.11801: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882078.11804: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882078.11865: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 
1726882078.13549: stdout chunk (state=3): >>>/root <<< 7557 1726882078.13672: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882078.13676: stdout chunk (state=3): >>><<< 7557 1726882078.13684: stderr chunk (state=3): >>><<< 7557 1726882078.13711: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7557 1726882078.13723: _low_level_execute_command(): starting 7557 1726882078.13729: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882078.1370914-7793-72538321704092 `" && echo ansible-tmp-1726882078.1370914-7793-72538321704092="` echo /root/.ansible/tmp/ansible-tmp-1726882078.1370914-7793-72538321704092 `" ) && sleep 0' 7557 1726882078.14201: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7557 1726882078.14204: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found <<< 7557 1726882078.14206: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 7557 1726882078.14216: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7557 1726882078.14218: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882078.14268: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882078.14273: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882078.14321: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882078.16161: stdout chunk 
(state=3): >>>ansible-tmp-1726882078.1370914-7793-72538321704092=/root/.ansible/tmp/ansible-tmp-1726882078.1370914-7793-72538321704092 <<< 7557 1726882078.16261: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882078.16286: stderr chunk (state=3): >>><<< 7557 1726882078.16292: stdout chunk (state=3): >>><<< 7557 1726882078.16309: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882078.1370914-7793-72538321704092=/root/.ansible/tmp/ansible-tmp-1726882078.1370914-7793-72538321704092 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7557 1726882078.16450: variable 'ansible_module_compression' from source: unknown 7557 1726882078.16453: ANSIBALLZ: Using generic lock for ansible.legacy.command 7557 1726882078.16455: ANSIBALLZ: Acquiring lock 7557 1726882078.16458: ANSIBALLZ: Lock acquired: 140194287013904 7557 1726882078.16459: ANSIBALLZ: Creating module 7557 1726882078.27016: ANSIBALLZ: Writing module into payload 7557 1726882078.27078: ANSIBALLZ: Writing module 7557 1726882078.27097: ANSIBALLZ: Renaming module 7557 1726882078.27106: ANSIBALLZ: Done creating module 7557 1726882078.27120: variable 'ansible_facts' from source: unknown 7557 1726882078.27164: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882078.1370914-7793-72538321704092/AnsiballZ_command.py 7557 1726882078.27294: Sending initial data 7557 1726882078.27300: Sent initial data (153 bytes) 7557 1726882078.28015: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 
1726882078.28068: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882078.28085: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882078.28110: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882078.28199: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882078.30331: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 7557 1726882078.30421: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 7557 1726882078.30489: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-7557ap94rh2e/tmpkqy2hj2p /root/.ansible/tmp/ansible-tmp-1726882078.1370914-7793-72538321704092/AnsiballZ_command.py <<< 7557 1726882078.30492: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882078.1370914-7793-72538321704092/AnsiballZ_command.py" <<< 7557 1726882078.30526: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-7557ap94rh2e/tmpkqy2hj2p" to remote "/root/.ansible/tmp/ansible-tmp-1726882078.1370914-7793-72538321704092/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882078.1370914-7793-72538321704092/AnsiballZ_command.py" <<< 7557 1726882078.31418: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882078.31421: stdout chunk (state=3): >>><<< 7557 1726882078.31423: stderr chunk (state=3): >>><<< 7557 1726882078.31425: done transferring module to remote 7557 1726882078.31427: _low_level_execute_command(): starting 7557 1726882078.31441: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882078.1370914-7793-72538321704092/ /root/.ansible/tmp/ansible-tmp-1726882078.1370914-7793-72538321704092/AnsiballZ_command.py && sleep 0' 7557 1726882078.31901: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882078.31913: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882078.31916: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration <<< 7557 1726882078.31919: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7557 1726882078.31924: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882078.31963: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882078.31966: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882078.32023: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882078.34446: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882078.34457: stderr chunk (state=3): >>><<< 7557 1726882078.34482: stdout chunk (state=3): >>><<< 7557 1726882078.34534: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7557 1726882078.34538: _low_level_execute_command(): starting 7557 1726882078.34541: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882078.1370914-7793-72538321704092/AnsiballZ_command.py && sleep 0' 7557 1726882078.34943: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882078.34949: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882078.34963: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882078.35019: 
stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882078.35025: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882078.35027: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882078.35077: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882078.57919: stdout chunk (state=3): >>> {"changed": true, "stdout": "eth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 21:27:58.573228", "end": "2024-09-20 21:27:58.577390", "delta": "0:00:00.004162", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 7557 1726882078.59929: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. <<< 7557 1726882078.59952: stderr chunk (state=3): >>><<< 7557 1726882078.59955: stdout chunk (state=3): >>><<< 7557 1726882078.59970: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "eth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 21:27:58.573228", "end": "2024-09-20 21:27:58.577390", "delta": "0:00:00.004162", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. 
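
The module invocation above shows the exact arguments the command module received: _raw_params "ls -1" with chdir /sys/class/net, and later entries in this log register the output under _current_interfaces and evaluate a changed_when condition to False. A minimal sketch of what the task at get_current_interfaces.yml:3 plausibly looks like, reconstructed from those log entries rather than from the file itself:

  # Reconstructed from the module args in the log; not the verbatim file contents.
  - name: Gather current interface info
    command:
      cmd: ls -1
      chdir: /sys/class/net
    register: _current_interfaces   # variable name taken from the set_fact entries later in the log
    changed_when: false             # inferred from the "Evaluated conditional (False): False" entry
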
7557 1726882078.60000: done with _execute_module (ansible.legacy.command, {'chdir': '/sys/class/net', '_raw_params': 'ls -1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882078.1370914-7793-72538321704092/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7557 1726882078.60008: _low_level_execute_command(): starting 7557 1726882078.60013: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882078.1370914-7793-72538321704092/ > /dev/null 2>&1 && sleep 0' 7557 1726882078.60475: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7557 1726882078.60478: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found <<< 7557 1726882078.60480: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882078.60484: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882078.60492: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882078.60562: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882078.60565: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882078.60642: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882078.63263: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882078.63499: stderr chunk (state=3): >>><<< 7557 1726882078.63503: stdout chunk (state=3): >>><<< 7557 1726882078.63506: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7557 1726882078.63509: handler run complete 7557 1726882078.63511: Evaluated conditional (False): False 7557 1726882078.63513: attempt loop complete, returning result 7557 1726882078.63515: _execute() done 7557 1726882078.63517: dumping result to json 7557 1726882078.63519: done dumping result, returning 7557 1726882078.63521: done running TaskExecutor() for managed_node3/TASK: Gather current interface info [12673a56-9f93-ed48-b3a5-000000000389] 7557 1726882078.63523: sending task result for task 12673a56-9f93-ed48-b3a5-000000000389 7557 1726882078.63613: done sending task result for task 12673a56-9f93-ed48-b3a5-000000000389 7557 1726882078.63616: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "cmd": [ "ls", "-1" ], "delta": "0:00:00.004162", "end": "2024-09-20 21:27:58.577390", "rc": 0, "start": "2024-09-20 21:27:58.573228" } STDOUT: eth0 lo 7557 1726882078.63707: no more pending results, returning what we have 7557 1726882078.63711: results queue empty 7557 1726882078.63712: checking for any_errors_fatal 7557 1726882078.63713: done checking for any_errors_fatal 7557 1726882078.63713: checking for max_fail_percentage 7557 1726882078.63715: done checking for max_fail_percentage 7557 1726882078.63716: checking to see if all hosts have failed and the running result is not ok 7557 1726882078.63717: done checking to see if all hosts have failed 7557 1726882078.63717: getting the remaining hosts for this loop 7557 1726882078.63719: done getting the remaining hosts for this loop 7557 1726882078.63723: getting the next task for host managed_node3 7557 1726882078.63736: done getting next task for host managed_node3 7557 1726882078.63738: ^ task is: TASK: Set current_interfaces 7557 1726882078.63743: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7557 1726882078.63747: getting variables 7557 1726882078.63749: in VariableManager get_vars() 7557 1726882078.63811: Calling all_inventory to load vars for managed_node3 7557 1726882078.63815: Calling groups_inventory to load vars for managed_node3 7557 1726882078.63818: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882078.63829: Calling all_plugins_play to load vars for managed_node3 7557 1726882078.63831: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882078.63834: Calling groups_plugins_play to load vars for managed_node3 7557 1726882078.64378: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882078.64877: done with get_vars() 7557 1726882078.64888: done getting variables 7557 1726882078.65069: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set current_interfaces] ************************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:9 Friday 20 September 2024 21:27:58 -0400 (0:00:00.552) 0:00:04.504 ****** 7557 1726882078.65114: entering _queue_task() for managed_node3/set_fact 7557 1726882078.65554: worker is 1 (out of 1 available) 7557 1726882078.65565: exiting _queue_task() for managed_node3/set_fact 7557 1726882078.65577: done queuing things up, now waiting for results queue to drain 7557 1726882078.65578: waiting for pending results... 7557 1726882078.65819: running TaskExecutor() for managed_node3/TASK: Set current_interfaces 7557 1726882078.65879: in run() - task 12673a56-9f93-ed48-b3a5-00000000038a 7557 1726882078.65884: variable 'ansible_search_path' from source: unknown 7557 1726882078.65887: variable 'ansible_search_path' from source: unknown 7557 1726882078.65921: calling self._execute() 7557 1726882078.66032: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882078.66092: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882078.66098: variable 'omit' from source: magic vars 7557 1726882078.66375: variable 'ansible_distribution_major_version' from source: facts 7557 1726882078.66385: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882078.66396: variable 'omit' from source: magic vars 7557 1726882078.66426: variable 'omit' from source: magic vars 7557 1726882078.66497: variable '_current_interfaces' from source: set_fact 7557 1726882078.66548: variable 'omit' from source: magic vars 7557 1726882078.66578: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7557 1726882078.66610: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7557 1726882078.66626: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7557 1726882078.66641: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7557 1726882078.66651: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7557 1726882078.66676: variable 
'inventory_hostname' from source: host vars for 'managed_node3' 7557 1726882078.66679: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882078.66682: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882078.66754: Set connection var ansible_module_compression to ZIP_DEFLATED 7557 1726882078.66760: Set connection var ansible_shell_executable to /bin/sh 7557 1726882078.66762: Set connection var ansible_shell_type to sh 7557 1726882078.66767: Set connection var ansible_pipelining to False 7557 1726882078.66770: Set connection var ansible_connection to ssh 7557 1726882078.66775: Set connection var ansible_timeout to 10 7557 1726882078.66791: variable 'ansible_shell_executable' from source: unknown 7557 1726882078.66799: variable 'ansible_connection' from source: unknown 7557 1726882078.66802: variable 'ansible_module_compression' from source: unknown 7557 1726882078.66804: variable 'ansible_shell_type' from source: unknown 7557 1726882078.66806: variable 'ansible_shell_executable' from source: unknown 7557 1726882078.66810: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882078.66814: variable 'ansible_pipelining' from source: unknown 7557 1726882078.66816: variable 'ansible_timeout' from source: unknown 7557 1726882078.66821: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882078.66924: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 7557 1726882078.66932: variable 'omit' from source: magic vars 7557 1726882078.66937: starting attempt loop 7557 1726882078.66940: running the handler 7557 1726882078.66950: handler run complete 7557 1726882078.66960: attempt loop complete, returning result 7557 1726882078.66962: _execute() done 7557 1726882078.66964: dumping result to json 7557 1726882078.66967: done dumping result, returning 7557 1726882078.66978: done running TaskExecutor() for managed_node3/TASK: Set current_interfaces [12673a56-9f93-ed48-b3a5-00000000038a] 7557 1726882078.66980: sending task result for task 12673a56-9f93-ed48-b3a5-00000000038a 7557 1726882078.67052: done sending task result for task 12673a56-9f93-ed48-b3a5-00000000038a 7557 1726882078.67055: WORKER PROCESS EXITING ok: [managed_node3] => { "ansible_facts": { "current_interfaces": [ "eth0", "lo" ] }, "changed": false } 7557 1726882078.67152: no more pending results, returning what we have 7557 1726882078.67154: results queue empty 7557 1726882078.67155: checking for any_errors_fatal 7557 1726882078.67160: done checking for any_errors_fatal 7557 1726882078.67161: checking for max_fail_percentage 7557 1726882078.67163: done checking for max_fail_percentage 7557 1726882078.67163: checking to see if all hosts have failed and the running result is not ok 7557 1726882078.67164: done checking to see if all hosts have failed 7557 1726882078.67164: getting the remaining hosts for this loop 7557 1726882078.67166: done getting the remaining hosts for this loop 7557 1726882078.67169: getting the next task for host managed_node3 7557 1726882078.67174: done getting next task for host managed_node3 7557 1726882078.67176: ^ task is: TASK: Show current_interfaces 7557 1726882078.67179: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, 
run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7557 1726882078.67182: getting variables 7557 1726882078.67183: in VariableManager get_vars() 7557 1726882078.67225: Calling all_inventory to load vars for managed_node3 7557 1726882078.67227: Calling groups_inventory to load vars for managed_node3 7557 1726882078.67228: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882078.67234: Calling all_plugins_play to load vars for managed_node3 7557 1726882078.67236: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882078.67238: Calling groups_plugins_play to load vars for managed_node3 7557 1726882078.67346: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882078.67464: done with get_vars() 7557 1726882078.67471: done getting variables 7557 1726882078.67534: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Show current_interfaces] ************************************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:5 Friday 20 September 2024 21:27:58 -0400 (0:00:00.024) 0:00:04.528 ****** 7557 1726882078.67556: entering _queue_task() for managed_node3/debug 7557 1726882078.67557: Creating lock for debug 7557 1726882078.67737: worker is 1 (out of 1 available) 7557 1726882078.67747: exiting _queue_task() for managed_node3/debug 7557 1726882078.67759: done queuing things up, now waiting for results queue to drain 7557 1726882078.67760: waiting for pending results... 
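
The set_fact result a few entries above shows current_interfaces being derived from the registered _current_interfaces output ("eth0", "lo"). A hedged sketch of the task at get_current_interfaces.yml:9; the use of stdout_lines is an assumption, since only the resulting fact is visible in the log:

  # Sketch only; the exact expression in the real task file may differ.
  - name: Set current_interfaces
    set_fact:
      current_interfaces: "{{ _current_interfaces.stdout_lines }}"
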
7557 1726882078.67905: running TaskExecutor() for managed_node3/TASK: Show current_interfaces 7557 1726882078.67966: in run() - task 12673a56-9f93-ed48-b3a5-0000000001cb 7557 1726882078.67976: variable 'ansible_search_path' from source: unknown 7557 1726882078.67981: variable 'ansible_search_path' from source: unknown 7557 1726882078.68014: calling self._execute() 7557 1726882078.68132: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882078.68135: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882078.68144: variable 'omit' from source: magic vars 7557 1726882078.68392: variable 'ansible_distribution_major_version' from source: facts 7557 1726882078.68406: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882078.68412: variable 'omit' from source: magic vars 7557 1726882078.68442: variable 'omit' from source: magic vars 7557 1726882078.68508: variable 'current_interfaces' from source: set_fact 7557 1726882078.68529: variable 'omit' from source: magic vars 7557 1726882078.68560: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7557 1726882078.68586: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7557 1726882078.68605: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7557 1726882078.68618: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7557 1726882078.68628: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7557 1726882078.68652: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7557 1726882078.68658: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882078.68661: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882078.68733: Set connection var ansible_module_compression to ZIP_DEFLATED 7557 1726882078.68738: Set connection var ansible_shell_executable to /bin/sh 7557 1726882078.68741: Set connection var ansible_shell_type to sh 7557 1726882078.68746: Set connection var ansible_pipelining to False 7557 1726882078.68748: Set connection var ansible_connection to ssh 7557 1726882078.68759: Set connection var ansible_timeout to 10 7557 1726882078.68774: variable 'ansible_shell_executable' from source: unknown 7557 1726882078.68778: variable 'ansible_connection' from source: unknown 7557 1726882078.68780: variable 'ansible_module_compression' from source: unknown 7557 1726882078.68782: variable 'ansible_shell_type' from source: unknown 7557 1726882078.68785: variable 'ansible_shell_executable' from source: unknown 7557 1726882078.68787: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882078.68789: variable 'ansible_pipelining' from source: unknown 7557 1726882078.68795: variable 'ansible_timeout' from source: unknown 7557 1726882078.68798: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882078.68897: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 7557 1726882078.68908: variable 'omit' 
from source: magic vars 7557 1726882078.68911: starting attempt loop 7557 1726882078.68916: running the handler 7557 1726882078.68949: handler run complete 7557 1726882078.68958: attempt loop complete, returning result 7557 1726882078.68961: _execute() done 7557 1726882078.68964: dumping result to json 7557 1726882078.68966: done dumping result, returning 7557 1726882078.68977: done running TaskExecutor() for managed_node3/TASK: Show current_interfaces [12673a56-9f93-ed48-b3a5-0000000001cb] 7557 1726882078.68980: sending task result for task 12673a56-9f93-ed48-b3a5-0000000001cb 7557 1726882078.69058: done sending task result for task 12673a56-9f93-ed48-b3a5-0000000001cb 7557 1726882078.69061: WORKER PROCESS EXITING ok: [managed_node3] => {} MSG: current_interfaces: ['eth0', 'lo'] 7557 1726882078.69174: no more pending results, returning what we have 7557 1726882078.69177: results queue empty 7557 1726882078.69178: checking for any_errors_fatal 7557 1726882078.69180: done checking for any_errors_fatal 7557 1726882078.69181: checking for max_fail_percentage 7557 1726882078.69183: done checking for max_fail_percentage 7557 1726882078.69183: checking to see if all hosts have failed and the running result is not ok 7557 1726882078.69184: done checking to see if all hosts have failed 7557 1726882078.69184: getting the remaining hosts for this loop 7557 1726882078.69186: done getting the remaining hosts for this loop 7557 1726882078.69188: getting the next task for host managed_node3 7557 1726882078.69195: done getting next task for host managed_node3 7557 1726882078.69198: ^ task is: TASK: Include the task 'manage_test_interface.yml' 7557 1726882078.69199: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7557 1726882078.69202: getting variables 7557 1726882078.69204: in VariableManager get_vars() 7557 1726882078.69237: Calling all_inventory to load vars for managed_node3 7557 1726882078.69239: Calling groups_inventory to load vars for managed_node3 7557 1726882078.69240: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882078.69247: Calling all_plugins_play to load vars for managed_node3 7557 1726882078.69248: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882078.69250: Calling groups_plugins_play to load vars for managed_node3 7557 1726882078.69351: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882078.69465: done with get_vars() 7557 1726882078.69472: done getting variables TASK [Include the task 'manage_test_interface.yml'] **************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_auto_gateway.yml:11 Friday 20 September 2024 21:27:58 -0400 (0:00:00.019) 0:00:04.548 ****** 7557 1726882078.69531: entering _queue_task() for managed_node3/include_tasks 7557 1726882078.69711: worker is 1 (out of 1 available) 7557 1726882078.69726: exiting _queue_task() for managed_node3/include_tasks 7557 1726882078.69739: done queuing things up, now waiting for results queue to drain 7557 1726882078.69740: waiting for pending results... 
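
The debug output "current_interfaces: ['eth0', 'lo']" above is consistent with a task of roughly the following shape at show_interfaces.yml:5; the exact msg wording is an assumption:

  - name: Show current_interfaces
    debug:
      msg: "current_interfaces: {{ current_interfaces }}"
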
7557 1726882078.69873: running TaskExecutor() for managed_node3/TASK: Include the task 'manage_test_interface.yml' 7557 1726882078.69930: in run() - task 12673a56-9f93-ed48-b3a5-00000000000c 7557 1726882078.69942: variable 'ansible_search_path' from source: unknown 7557 1726882078.69975: calling self._execute() 7557 1726882078.70045: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882078.70050: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882078.70058: variable 'omit' from source: magic vars 7557 1726882078.70326: variable 'ansible_distribution_major_version' from source: facts 7557 1726882078.70335: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882078.70341: _execute() done 7557 1726882078.70344: dumping result to json 7557 1726882078.70346: done dumping result, returning 7557 1726882078.70353: done running TaskExecutor() for managed_node3/TASK: Include the task 'manage_test_interface.yml' [12673a56-9f93-ed48-b3a5-00000000000c] 7557 1726882078.70358: sending task result for task 12673a56-9f93-ed48-b3a5-00000000000c 7557 1726882078.70440: done sending task result for task 12673a56-9f93-ed48-b3a5-00000000000c 7557 1726882078.70444: WORKER PROCESS EXITING 7557 1726882078.70467: no more pending results, returning what we have 7557 1726882078.70471: in VariableManager get_vars() 7557 1726882078.70520: Calling all_inventory to load vars for managed_node3 7557 1726882078.70523: Calling groups_inventory to load vars for managed_node3 7557 1726882078.70525: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882078.70534: Calling all_plugins_play to load vars for managed_node3 7557 1726882078.70536: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882078.70538: Calling groups_plugins_play to load vars for managed_node3 7557 1726882078.70650: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882078.70781: done with get_vars() 7557 1726882078.70788: variable 'ansible_search_path' from source: unknown 7557 1726882078.70801: we have included files to process 7557 1726882078.70801: generating all_blocks data 7557 1726882078.70802: done generating all_blocks data 7557 1726882078.70805: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 7557 1726882078.70806: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 7557 1726882078.70807: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 7557 1726882078.71124: in VariableManager get_vars() 7557 1726882078.71141: done with get_vars() 7557 1726882078.71285: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 7557 1726882078.71673: done processing included file 7557 1726882078.71675: iterating over new_blocks loaded from include file 7557 1726882078.71676: in VariableManager get_vars() 7557 1726882078.71689: done with get_vars() 7557 1726882078.71692: filtering new block on tags 7557 1726882078.71711: done filtering new block on tags 7557 1726882078.71713: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml for managed_node3 
7557 1726882078.71716: extending task lists for all hosts with included blocks 7557 1726882078.73902: done extending task lists 7557 1726882078.73903: done processing included files 7557 1726882078.73904: results queue empty 7557 1726882078.73904: checking for any_errors_fatal 7557 1726882078.73906: done checking for any_errors_fatal 7557 1726882078.73907: checking for max_fail_percentage 7557 1726882078.73908: done checking for max_fail_percentage 7557 1726882078.73908: checking to see if all hosts have failed and the running result is not ok 7557 1726882078.73909: done checking to see if all hosts have failed 7557 1726882078.73909: getting the remaining hosts for this loop 7557 1726882078.73910: done getting the remaining hosts for this loop 7557 1726882078.73911: getting the next task for host managed_node3 7557 1726882078.73914: done getting next task for host managed_node3 7557 1726882078.73915: ^ task is: TASK: Ensure state in ["present", "absent"] 7557 1726882078.73917: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7557 1726882078.73918: getting variables 7557 1726882078.73919: in VariableManager get_vars() 7557 1726882078.73933: Calling all_inventory to load vars for managed_node3 7557 1726882078.73935: Calling groups_inventory to load vars for managed_node3 7557 1726882078.73936: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882078.73940: Calling all_plugins_play to load vars for managed_node3 7557 1726882078.73942: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882078.73943: Calling groups_plugins_play to load vars for managed_node3 7557 1726882078.74032: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882078.74153: done with get_vars() 7557 1726882078.74160: done getting variables 7557 1726882078.74208: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Ensure state in ["present", "absent"]] *********************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:3 Friday 20 September 2024 21:27:58 -0400 (0:00:00.046) 0:00:04.595 ****** 7557 1726882078.74226: entering _queue_task() for managed_node3/fail 7557 1726882078.74228: Creating lock for fail 7557 1726882078.74447: worker is 1 (out of 1 available) 7557 1726882078.74459: exiting _queue_task() for managed_node3/fail 7557 1726882078.74471: done queuing things up, now waiting for results queue to drain 7557 1726882078.74472: waiting for pending results... 
7557 1726882078.74621: running TaskExecutor() for managed_node3/TASK: Ensure state in ["present", "absent"] 7557 1726882078.74678: in run() - task 12673a56-9f93-ed48-b3a5-0000000003a5 7557 1726882078.74687: variable 'ansible_search_path' from source: unknown 7557 1726882078.74696: variable 'ansible_search_path' from source: unknown 7557 1726882078.74723: calling self._execute() 7557 1726882078.74784: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882078.74787: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882078.74800: variable 'omit' from source: magic vars 7557 1726882078.75064: variable 'ansible_distribution_major_version' from source: facts 7557 1726882078.75074: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882078.75167: variable 'state' from source: include params 7557 1726882078.75172: Evaluated conditional (state not in ["present", "absent"]): False 7557 1726882078.75175: when evaluation is False, skipping this task 7557 1726882078.75178: _execute() done 7557 1726882078.75181: dumping result to json 7557 1726882078.75184: done dumping result, returning 7557 1726882078.75189: done running TaskExecutor() for managed_node3/TASK: Ensure state in ["present", "absent"] [12673a56-9f93-ed48-b3a5-0000000003a5] 7557 1726882078.75196: sending task result for task 12673a56-9f93-ed48-b3a5-0000000003a5 7557 1726882078.75275: done sending task result for task 12673a56-9f93-ed48-b3a5-0000000003a5 7557 1726882078.75277: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "state not in [\"present\", \"absent\"]", "skip_reason": "Conditional result was False" } 7557 1726882078.75328: no more pending results, returning what we have 7557 1726882078.75332: results queue empty 7557 1726882078.75333: checking for any_errors_fatal 7557 1726882078.75334: done checking for any_errors_fatal 7557 1726882078.75335: checking for max_fail_percentage 7557 1726882078.75337: done checking for max_fail_percentage 7557 1726882078.75337: checking to see if all hosts have failed and the running result is not ok 7557 1726882078.75338: done checking to see if all hosts have failed 7557 1726882078.75339: getting the remaining hosts for this loop 7557 1726882078.75340: done getting the remaining hosts for this loop 7557 1726882078.75343: getting the next task for host managed_node3 7557 1726882078.75347: done getting next task for host managed_node3 7557 1726882078.75350: ^ task is: TASK: Ensure type in ["dummy", "tap", "veth"] 7557 1726882078.75353: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7557 1726882078.75355: getting variables 7557 1726882078.75357: in VariableManager get_vars() 7557 1726882078.75401: Calling all_inventory to load vars for managed_node3 7557 1726882078.75403: Calling groups_inventory to load vars for managed_node3 7557 1726882078.75405: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882078.75414: Calling all_plugins_play to load vars for managed_node3 7557 1726882078.75416: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882078.75419: Calling groups_plugins_play to load vars for managed_node3 7557 1726882078.75529: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882078.75645: done with get_vars() 7557 1726882078.75652: done getting variables 7557 1726882078.75690: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Ensure type in ["dummy", "tap", "veth"]] ********************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:8 Friday 20 September 2024 21:27:58 -0400 (0:00:00.014) 0:00:04.610 ****** 7557 1726882078.75711: entering _queue_task() for managed_node3/fail 7557 1726882078.75879: worker is 1 (out of 1 available) 7557 1726882078.75894: exiting _queue_task() for managed_node3/fail 7557 1726882078.75906: done queuing things up, now waiting for results queue to drain 7557 1726882078.75907: waiting for pending results... 7557 1726882078.76055: running TaskExecutor() for managed_node3/TASK: Ensure type in ["dummy", "tap", "veth"] 7557 1726882078.76122: in run() - task 12673a56-9f93-ed48-b3a5-0000000003a6 7557 1726882078.76137: variable 'ansible_search_path' from source: unknown 7557 1726882078.76141: variable 'ansible_search_path' from source: unknown 7557 1726882078.76168: calling self._execute() 7557 1726882078.76232: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882078.76237: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882078.76247: variable 'omit' from source: magic vars 7557 1726882078.76508: variable 'ansible_distribution_major_version' from source: facts 7557 1726882078.76518: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882078.76615: variable 'type' from source: play vars 7557 1726882078.76621: Evaluated conditional (type not in ["dummy", "tap", "veth"]): False 7557 1726882078.76623: when evaluation is False, skipping this task 7557 1726882078.76626: _execute() done 7557 1726882078.76629: dumping result to json 7557 1726882078.76631: done dumping result, returning 7557 1726882078.76637: done running TaskExecutor() for managed_node3/TASK: Ensure type in ["dummy", "tap", "veth"] [12673a56-9f93-ed48-b3a5-0000000003a6] 7557 1726882078.76642: sending task result for task 12673a56-9f93-ed48-b3a5-0000000003a6 7557 1726882078.76719: done sending task result for task 12673a56-9f93-ed48-b3a5-0000000003a6 7557 1726882078.76722: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "type not in [\"dummy\", \"tap\", \"veth\"]", "skip_reason": "Conditional result was False" } 7557 1726882078.76765: no more pending results, returning what we have 7557 
1726882078.76769: results queue empty 7557 1726882078.76770: checking for any_errors_fatal 7557 1726882078.76776: done checking for any_errors_fatal 7557 1726882078.76777: checking for max_fail_percentage 7557 1726882078.76778: done checking for max_fail_percentage 7557 1726882078.76779: checking to see if all hosts have failed and the running result is not ok 7557 1726882078.76780: done checking to see if all hosts have failed 7557 1726882078.76780: getting the remaining hosts for this loop 7557 1726882078.76782: done getting the remaining hosts for this loop 7557 1726882078.76785: getting the next task for host managed_node3 7557 1726882078.76791: done getting next task for host managed_node3 7557 1726882078.76796: ^ task is: TASK: Include the task 'show_interfaces.yml' 7557 1726882078.76798: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7557 1726882078.76801: getting variables 7557 1726882078.76803: in VariableManager get_vars() 7557 1726882078.76849: Calling all_inventory to load vars for managed_node3 7557 1726882078.76851: Calling groups_inventory to load vars for managed_node3 7557 1726882078.76853: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882078.76861: Calling all_plugins_play to load vars for managed_node3 7557 1726882078.76863: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882078.76866: Calling groups_plugins_play to load vars for managed_node3 7557 1726882078.77004: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882078.77120: done with get_vars() 7557 1726882078.77127: done getting variables TASK [Include the task 'show_interfaces.yml'] ********************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:13 Friday 20 September 2024 21:27:58 -0400 (0:00:00.014) 0:00:04.624 ****** 7557 1726882078.77186: entering _queue_task() for managed_node3/include_tasks 7557 1726882078.77357: worker is 1 (out of 1 available) 7557 1726882078.77369: exiting _queue_task() for managed_node3/include_tasks 7557 1726882078.77380: done queuing things up, now waiting for results queue to drain 7557 1726882078.77382: waiting for pending results... 
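
Both guard tasks above were skipped because their when conditions evaluated to False; the false_condition strings in the skip results give those conditions verbatim. A sketch of what such validation tasks typically look like (the msg text is an assumption):

  - name: Ensure state in ["present", "absent"]
    fail:
      msg: "state must be 'present' or 'absent'"
    when: state not in ["present", "absent"]

  - name: Ensure type in ["dummy", "tap", "veth"]
    fail:
      msg: "type must be one of 'dummy', 'tap', 'veth'"
    when: type not in ["dummy", "tap", "veth"]
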
7557 1726882078.77526: running TaskExecutor() for managed_node3/TASK: Include the task 'show_interfaces.yml' 7557 1726882078.77587: in run() - task 12673a56-9f93-ed48-b3a5-0000000003a7 7557 1726882078.77601: variable 'ansible_search_path' from source: unknown 7557 1726882078.77606: variable 'ansible_search_path' from source: unknown 7557 1726882078.77635: calling self._execute() 7557 1726882078.77699: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882078.77703: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882078.77710: variable 'omit' from source: magic vars 7557 1726882078.77962: variable 'ansible_distribution_major_version' from source: facts 7557 1726882078.77971: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882078.77977: _execute() done 7557 1726882078.77980: dumping result to json 7557 1726882078.77982: done dumping result, returning 7557 1726882078.77988: done running TaskExecutor() for managed_node3/TASK: Include the task 'show_interfaces.yml' [12673a56-9f93-ed48-b3a5-0000000003a7] 7557 1726882078.77996: sending task result for task 12673a56-9f93-ed48-b3a5-0000000003a7 7557 1726882078.78074: done sending task result for task 12673a56-9f93-ed48-b3a5-0000000003a7 7557 1726882078.78076: WORKER PROCESS EXITING 7557 1726882078.78112: no more pending results, returning what we have 7557 1726882078.78117: in VariableManager get_vars() 7557 1726882078.78160: Calling all_inventory to load vars for managed_node3 7557 1726882078.78162: Calling groups_inventory to load vars for managed_node3 7557 1726882078.78165: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882078.78173: Calling all_plugins_play to load vars for managed_node3 7557 1726882078.78175: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882078.78178: Calling groups_plugins_play to load vars for managed_node3 7557 1726882078.78289: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882078.78407: done with get_vars() 7557 1726882078.78412: variable 'ansible_search_path' from source: unknown 7557 1726882078.78413: variable 'ansible_search_path' from source: unknown 7557 1726882078.78438: we have included files to process 7557 1726882078.78439: generating all_blocks data 7557 1726882078.78440: done generating all_blocks data 7557 1726882078.78444: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 7557 1726882078.78444: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 7557 1726882078.78446: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 7557 1726882078.78510: in VariableManager get_vars() 7557 1726882078.78530: done with get_vars() 7557 1726882078.78605: done processing included file 7557 1726882078.78606: iterating over new_blocks loaded from include file 7557 1726882078.78607: in VariableManager get_vars() 7557 1726882078.78621: done with get_vars() 7557 1726882078.78622: filtering new block on tags 7557 1726882078.78632: done filtering new block on tags 7557 1726882078.78633: done iterating over new_blocks loaded from include file included: 
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml for managed_node3 7557 1726882078.78638: extending task lists for all hosts with included blocks 7557 1726882078.78897: done extending task lists 7557 1726882078.78898: done processing included files 7557 1726882078.78898: results queue empty 7557 1726882078.78899: checking for any_errors_fatal 7557 1726882078.78901: done checking for any_errors_fatal 7557 1726882078.78901: checking for max_fail_percentage 7557 1726882078.78902: done checking for max_fail_percentage 7557 1726882078.78902: checking to see if all hosts have failed and the running result is not ok 7557 1726882078.78903: done checking to see if all hosts have failed 7557 1726882078.78903: getting the remaining hosts for this loop 7557 1726882078.78904: done getting the remaining hosts for this loop 7557 1726882078.78905: getting the next task for host managed_node3 7557 1726882078.78908: done getting next task for host managed_node3 7557 1726882078.78909: ^ task is: TASK: Include the task 'get_current_interfaces.yml' 7557 1726882078.78911: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7557 1726882078.78913: getting variables 7557 1726882078.78913: in VariableManager get_vars() 7557 1726882078.78924: Calling all_inventory to load vars for managed_node3 7557 1726882078.78925: Calling groups_inventory to load vars for managed_node3 7557 1726882078.78926: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882078.78930: Calling all_plugins_play to load vars for managed_node3 7557 1726882078.78931: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882078.78933: Calling groups_plugins_play to load vars for managed_node3 7557 1726882078.79014: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882078.79124: done with get_vars() 7557 1726882078.79130: done getting variables TASK [Include the task 'get_current_interfaces.yml'] *************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:3 Friday 20 September 2024 21:27:58 -0400 (0:00:00.019) 0:00:04.644 ****** 7557 1726882078.79176: entering _queue_task() for managed_node3/include_tasks 7557 1726882078.79366: worker is 1 (out of 1 available) 7557 1726882078.79378: exiting _queue_task() for managed_node3/include_tasks 7557 1726882078.79394: done queuing things up, now waiting for results queue to drain 7557 1726882078.79396: waiting for pending results... 
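The two task paths this log reports for show_interfaces.yml (the include at line 3 just queued above, and the 'Show current_interfaces' debug at line 5 that runs further down) suggest a file roughly like the following sketch. This is an inference from the log, not the verbatim file; the real tasks may carry extra options.

- name: Include the task 'get_current_interfaces.yml'
  include_tasks: get_current_interfaces.yml

- name: Show current_interfaces
  debug:
    msg: "current_interfaces: {{ current_interfaces }}"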
7557 1726882078.79536: running TaskExecutor() for managed_node3/TASK: Include the task 'get_current_interfaces.yml' 7557 1726882078.79601: in run() - task 12673a56-9f93-ed48-b3a5-00000000057e 7557 1726882078.79612: variable 'ansible_search_path' from source: unknown 7557 1726882078.79618: variable 'ansible_search_path' from source: unknown 7557 1726882078.79646: calling self._execute() 7557 1726882078.79706: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882078.79711: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882078.79720: variable 'omit' from source: magic vars 7557 1726882078.79971: variable 'ansible_distribution_major_version' from source: facts 7557 1726882078.79981: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882078.79987: _execute() done 7557 1726882078.79992: dumping result to json 7557 1726882078.79997: done dumping result, returning 7557 1726882078.80000: done running TaskExecutor() for managed_node3/TASK: Include the task 'get_current_interfaces.yml' [12673a56-9f93-ed48-b3a5-00000000057e] 7557 1726882078.80006: sending task result for task 12673a56-9f93-ed48-b3a5-00000000057e 7557 1726882078.80081: done sending task result for task 12673a56-9f93-ed48-b3a5-00000000057e 7557 1726882078.80084: WORKER PROCESS EXITING 7557 1726882078.80118: no more pending results, returning what we have 7557 1726882078.80122: in VariableManager get_vars() 7557 1726882078.80168: Calling all_inventory to load vars for managed_node3 7557 1726882078.80170: Calling groups_inventory to load vars for managed_node3 7557 1726882078.80173: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882078.80181: Calling all_plugins_play to load vars for managed_node3 7557 1726882078.80184: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882078.80186: Calling groups_plugins_play to load vars for managed_node3 7557 1726882078.80322: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882078.80439: done with get_vars() 7557 1726882078.80445: variable 'ansible_search_path' from source: unknown 7557 1726882078.80446: variable 'ansible_search_path' from source: unknown 7557 1726882078.80481: we have included files to process 7557 1726882078.80482: generating all_blocks data 7557 1726882078.80483: done generating all_blocks data 7557 1726882078.80484: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 7557 1726882078.80485: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 7557 1726882078.80486: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 7557 1726882078.80649: done processing included file 7557 1726882078.80650: iterating over new_blocks loaded from include file 7557 1726882078.80652: in VariableManager get_vars() 7557 1726882078.80669: done with get_vars() 7557 1726882078.80670: filtering new block on tags 7557 1726882078.80681: done filtering new block on tags 7557 1726882078.80682: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml for managed_node3 7557 1726882078.80685: extending 
task lists for all hosts with included blocks 7557 1726882078.80776: done extending task lists 7557 1726882078.80777: done processing included files 7557 1726882078.80778: results queue empty 7557 1726882078.80778: checking for any_errors_fatal 7557 1726882078.80780: done checking for any_errors_fatal 7557 1726882078.80780: checking for max_fail_percentage 7557 1726882078.80781: done checking for max_fail_percentage 7557 1726882078.80781: checking to see if all hosts have failed and the running result is not ok 7557 1726882078.80782: done checking to see if all hosts have failed 7557 1726882078.80782: getting the remaining hosts for this loop 7557 1726882078.80783: done getting the remaining hosts for this loop 7557 1726882078.80784: getting the next task for host managed_node3 7557 1726882078.80787: done getting next task for host managed_node3 7557 1726882078.80788: ^ task is: TASK: Gather current interface info 7557 1726882078.80794: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7557 1726882078.80796: getting variables 7557 1726882078.80796: in VariableManager get_vars() 7557 1726882078.80807: Calling all_inventory to load vars for managed_node3 7557 1726882078.80808: Calling groups_inventory to load vars for managed_node3 7557 1726882078.80809: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882078.80813: Calling all_plugins_play to load vars for managed_node3 7557 1726882078.80814: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882078.80815: Calling groups_plugins_play to load vars for managed_node3 7557 1726882078.80900: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882078.81012: done with get_vars() 7557 1726882078.81018: done getting variables 7557 1726882078.81050: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gather current interface info] ******************************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3 Friday 20 September 2024 21:27:58 -0400 (0:00:00.018) 0:00:04.663 ****** 7557 1726882078.81078: entering _queue_task() for managed_node3/command 7557 1726882078.81268: worker is 1 (out of 1 available) 7557 1726882078.81280: exiting _queue_task() for managed_node3/command 7557 1726882078.81295: done queuing things up, now waiting for results queue to drain 7557 1726882078.81296: waiting for pending results... 7557 1726882078.81435: running TaskExecutor() for managed_node3/TASK: Gather current interface info 7557 1726882078.81504: in run() - task 12673a56-9f93-ed48-b3a5-0000000005b5 7557 1726882078.81516: variable 'ansible_search_path' from source: unknown 7557 1726882078.81520: variable 'ansible_search_path' from source: unknown 7557 1726882078.81544: calling self._execute() 7557 1726882078.81608: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882078.81612: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882078.81621: variable 'omit' from source: magic vars 7557 1726882078.82118: variable 'ansible_distribution_major_version' from source: facts 7557 1726882078.82128: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882078.82134: variable 'omit' from source: magic vars 7557 1726882078.82165: variable 'omit' from source: magic vars 7557 1726882078.82189: variable 'omit' from source: magic vars 7557 1726882078.82222: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7557 1726882078.82248: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7557 1726882078.82262: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7557 1726882078.82300: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7557 1726882078.82379: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7557 1726882078.82382: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7557 
1726882078.82385: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882078.82387: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882078.82504: Set connection var ansible_module_compression to ZIP_DEFLATED 7557 1726882078.82507: Set connection var ansible_shell_executable to /bin/sh 7557 1726882078.82509: Set connection var ansible_shell_type to sh 7557 1726882078.82512: Set connection var ansible_pipelining to False 7557 1726882078.82514: Set connection var ansible_connection to ssh 7557 1726882078.82516: Set connection var ansible_timeout to 10 7557 1726882078.82524: variable 'ansible_shell_executable' from source: unknown 7557 1726882078.82531: variable 'ansible_connection' from source: unknown 7557 1726882078.82538: variable 'ansible_module_compression' from source: unknown 7557 1726882078.82545: variable 'ansible_shell_type' from source: unknown 7557 1726882078.82552: variable 'ansible_shell_executable' from source: unknown 7557 1726882078.82559: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882078.82566: variable 'ansible_pipelining' from source: unknown 7557 1726882078.82573: variable 'ansible_timeout' from source: unknown 7557 1726882078.82581: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882078.82744: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 7557 1726882078.82759: variable 'omit' from source: magic vars 7557 1726882078.82829: starting attempt loop 7557 1726882078.82832: running the handler 7557 1726882078.82834: _low_level_execute_command(): starting 7557 1726882078.82836: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7557 1726882078.83486: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882078.83506: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 7557 1726882078.83519: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882078.83558: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882078.83585: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882078.83640: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882078.85858: stdout chunk (state=3): >>>/root <<< 7557 1726882078.86152: stderr chunk (state=3): >>>debug2: 
Received exit status from master 0 <<< 7557 1726882078.86155: stdout chunk (state=3): >>><<< 7557 1726882078.86157: stderr chunk (state=3): >>><<< 7557 1726882078.86160: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7557 1726882078.86162: _low_level_execute_command(): starting 7557 1726882078.86164: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882078.8607264-7830-23624317994487 `" && echo ansible-tmp-1726882078.8607264-7830-23624317994487="` echo /root/.ansible/tmp/ansible-tmp-1726882078.8607264-7830-23624317994487 `" ) && sleep 0' 7557 1726882078.86708: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7557 1726882078.86731: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7557 1726882078.86751: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882078.86769: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7557 1726882078.86814: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882078.86909: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882078.86920: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882078.86977: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882078.87035: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882078.89742: stdout chunk (state=3): 
>>>ansible-tmp-1726882078.8607264-7830-23624317994487=/root/.ansible/tmp/ansible-tmp-1726882078.8607264-7830-23624317994487 <<< 7557 1726882078.89972: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882078.89976: stdout chunk (state=3): >>><<< 7557 1726882078.89979: stderr chunk (state=3): >>><<< 7557 1726882078.89999: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882078.8607264-7830-23624317994487=/root/.ansible/tmp/ansible-tmp-1726882078.8607264-7830-23624317994487 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7557 1726882078.90036: variable 'ansible_module_compression' from source: unknown 7557 1726882078.90207: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-7557ap94rh2e/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 7557 1726882078.90210: variable 'ansible_facts' from source: unknown 7557 1726882078.90231: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882078.8607264-7830-23624317994487/AnsiballZ_command.py 7557 1726882078.90415: Sending initial data 7557 1726882078.90453: Sent initial data (153 bytes) 7557 1726882078.90871: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882078.90884: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882078.90930: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882078.90943: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7557 
1726882078.91003: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882078.93199: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 7557 1726882078.93248: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 7557 1726882078.93297: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-7557ap94rh2e/tmpm0lp84b0 /root/.ansible/tmp/ansible-tmp-1726882078.8607264-7830-23624317994487/AnsiballZ_command.py <<< 7557 1726882078.93300: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882078.8607264-7830-23624317994487/AnsiballZ_command.py" <<< 7557 1726882078.93343: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-7557ap94rh2e/tmpm0lp84b0" to remote "/root/.ansible/tmp/ansible-tmp-1726882078.8607264-7830-23624317994487/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882078.8607264-7830-23624317994487/AnsiballZ_command.py" <<< 7557 1726882078.93916: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882078.93955: stderr chunk (state=3): >>><<< 7557 1726882078.93959: stdout chunk (state=3): >>><<< 7557 1726882078.93999: done transferring module to remote 7557 1726882078.94016: _low_level_execute_command(): starting 7557 1726882078.94020: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882078.8607264-7830-23624317994487/ /root/.ansible/tmp/ansible-tmp-1726882078.8607264-7830-23624317994487/AnsiballZ_command.py && sleep 0' 7557 1726882078.94448: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882078.94452: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found <<< 7557 1726882078.94454: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 7557 1726882078.94456: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7557 1726882078.94458: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882078.94508: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882078.94558: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882078.94616: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882078.96770: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882078.96797: stderr chunk (state=3): >>><<< 7557 1726882078.96801: stdout chunk (state=3): >>><<< 7557 1726882078.96812: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7557 1726882078.96815: _low_level_execute_command(): starting 7557 1726882078.96821: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882078.8607264-7830-23624317994487/AnsiballZ_command.py && sleep 0' 7557 1726882078.97249: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882078.97253: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882078.97255: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address <<< 7557 1726882078.97257: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882078.97259: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882078.97311: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7557 
1726882078.97376: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882079.16653: stdout chunk (state=3): >>> {"changed": true, "stdout": "eth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 21:27:59.160787", "end": "2024-09-20 21:27:59.164882", "delta": "0:00:00.004095", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 7557 1726882079.18646: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. <<< 7557 1726882079.18665: stderr chunk (state=3): >>><<< 7557 1726882079.18668: stdout chunk (state=3): >>><<< 7557 1726882079.18684: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "eth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 21:27:59.160787", "end": "2024-09-20 21:27:59.164882", "delta": "0:00:00.004095", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. 
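The module_args echoed back in the result above make the shape of the 'Gather current interface info' task at get_current_interfaces.yml:3 fairly clear: a command task that lists /sys/class/net and registers the output. A sketch follows; the register name _current_interfaces is taken from the set_fact-sourced variable that appears later in this log, and the fully qualified module name is an assumption.

- name: Gather current interface info
  ansible.builtin.command:
    cmd: ls -1
    chdir: /sys/class/net
  register: _current_interfaces

The surrounding chunks also show the full remote lifecycle for this single task: create a temporary directory under ~/.ansible/tmp, sftp the AnsiballZ_command.py payload to it, chmod it, execute it with /usr/bin/python3.12, and finally remove the temporary directory, all over the existing multiplexed SSH connection.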
7557 1726882079.18719: done with _execute_module (ansible.legacy.command, {'chdir': '/sys/class/net', '_raw_params': 'ls -1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882078.8607264-7830-23624317994487/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7557 1726882079.18725: _low_level_execute_command(): starting 7557 1726882079.18730: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882078.8607264-7830-23624317994487/ > /dev/null 2>&1 && sleep 0' 7557 1726882079.19155: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882079.19159: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882079.19161: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882079.19163: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882079.19216: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882079.19222: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882079.19272: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882079.21783: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882079.21810: stderr chunk (state=3): >>><<< 7557 1726882079.21813: stdout chunk (state=3): >>><<< 7557 1726882079.21826: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7557 1726882079.21832: handler run complete 7557 1726882079.21851: Evaluated conditional (False): False 7557 1726882079.21857: attempt loop complete, returning result 7557 1726882079.21860: _execute() done 7557 1726882079.21862: dumping result to json 7557 1726882079.21867: done dumping result, returning 7557 1726882079.21874: done running TaskExecutor() for managed_node3/TASK: Gather current interface info [12673a56-9f93-ed48-b3a5-0000000005b5] 7557 1726882079.21878: sending task result for task 12673a56-9f93-ed48-b3a5-0000000005b5 7557 1726882079.21977: done sending task result for task 12673a56-9f93-ed48-b3a5-0000000005b5 7557 1726882079.21979: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "cmd": [ "ls", "-1" ], "delta": "0:00:00.004095", "end": "2024-09-20 21:27:59.164882", "rc": 0, "start": "2024-09-20 21:27:59.160787" } STDOUT: eth0 lo 7557 1726882079.22340: no more pending results, returning what we have 7557 1726882079.22342: results queue empty 7557 1726882079.22342: checking for any_errors_fatal 7557 1726882079.22343: done checking for any_errors_fatal 7557 1726882079.22344: checking for max_fail_percentage 7557 1726882079.22345: done checking for max_fail_percentage 7557 1726882079.22345: checking to see if all hosts have failed and the running result is not ok 7557 1726882079.22346: done checking to see if all hosts have failed 7557 1726882079.22346: getting the remaining hosts for this loop 7557 1726882079.22347: done getting the remaining hosts for this loop 7557 1726882079.22349: getting the next task for host managed_node3 7557 1726882079.22352: done getting next task for host managed_node3 7557 1726882079.22354: ^ task is: TASK: Set current_interfaces 7557 1726882079.22358: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7557 1726882079.22360: getting variables 7557 1726882079.22361: in VariableManager get_vars() 7557 1726882079.22385: Calling all_inventory to load vars for managed_node3 7557 1726882079.22386: Calling groups_inventory to load vars for managed_node3 7557 1726882079.22388: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882079.22397: Calling all_plugins_play to load vars for managed_node3 7557 1726882079.22399: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882079.22401: Calling groups_plugins_play to load vars for managed_node3 7557 1726882079.22498: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882079.22612: done with get_vars() 7557 1726882079.22620: done getting variables 7557 1726882079.22662: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set current_interfaces] ************************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:9 Friday 20 September 2024 21:27:59 -0400 (0:00:00.416) 0:00:05.079 ****** 7557 1726882079.22682: entering _queue_task() for managed_node3/set_fact 7557 1726882079.23112: worker is 1 (out of 1 available) 7557 1726882079.23119: exiting _queue_task() for managed_node3/set_fact 7557 1726882079.23129: done queuing things up, now waiting for results queue to drain 7557 1726882079.23131: waiting for pending results... 7557 1726882079.23257: running TaskExecutor() for managed_node3/TASK: Set current_interfaces 7557 1726882079.23296: in run() - task 12673a56-9f93-ed48-b3a5-0000000005b6 7557 1726882079.23317: variable 'ansible_search_path' from source: unknown 7557 1726882079.23326: variable 'ansible_search_path' from source: unknown 7557 1726882079.23370: calling self._execute() 7557 1726882079.23467: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882079.23480: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882079.23497: variable 'omit' from source: magic vars 7557 1726882079.23863: variable 'ansible_distribution_major_version' from source: facts 7557 1726882079.23882: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882079.23998: variable 'omit' from source: magic vars 7557 1726882079.24004: variable 'omit' from source: magic vars 7557 1726882079.24064: variable '_current_interfaces' from source: set_fact 7557 1726882079.24132: variable 'omit' from source: magic vars 7557 1726882079.24192: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7557 1726882079.24215: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7557 1726882079.24228: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7557 1726882079.24242: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7557 1726882079.24253: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7557 1726882079.24276: variable 
'inventory_hostname' from source: host vars for 'managed_node3' 7557 1726882079.24279: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882079.24282: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882079.24366: Set connection var ansible_module_compression to ZIP_DEFLATED 7557 1726882079.24372: Set connection var ansible_shell_executable to /bin/sh 7557 1726882079.24375: Set connection var ansible_shell_type to sh 7557 1726882079.24379: Set connection var ansible_pipelining to False 7557 1726882079.24382: Set connection var ansible_connection to ssh 7557 1726882079.24387: Set connection var ansible_timeout to 10 7557 1726882079.24405: variable 'ansible_shell_executable' from source: unknown 7557 1726882079.24409: variable 'ansible_connection' from source: unknown 7557 1726882079.24411: variable 'ansible_module_compression' from source: unknown 7557 1726882079.24414: variable 'ansible_shell_type' from source: unknown 7557 1726882079.24416: variable 'ansible_shell_executable' from source: unknown 7557 1726882079.24421: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882079.24424: variable 'ansible_pipelining' from source: unknown 7557 1726882079.24428: variable 'ansible_timeout' from source: unknown 7557 1726882079.24431: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882079.24528: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 7557 1726882079.24537: variable 'omit' from source: magic vars 7557 1726882079.24547: starting attempt loop 7557 1726882079.24550: running the handler 7557 1726882079.24560: handler run complete 7557 1726882079.24563: attempt loop complete, returning result 7557 1726882079.24566: _execute() done 7557 1726882079.24569: dumping result to json 7557 1726882079.24571: done dumping result, returning 7557 1726882079.24579: done running TaskExecutor() for managed_node3/TASK: Set current_interfaces [12673a56-9f93-ed48-b3a5-0000000005b6] 7557 1726882079.24582: sending task result for task 12673a56-9f93-ed48-b3a5-0000000005b6 7557 1726882079.24656: done sending task result for task 12673a56-9f93-ed48-b3a5-0000000005b6 7557 1726882079.24660: WORKER PROCESS EXITING ok: [managed_node3] => { "ansible_facts": { "current_interfaces": [ "eth0", "lo" ] }, "changed": false } 7557 1726882079.24723: no more pending results, returning what we have 7557 1726882079.24725: results queue empty 7557 1726882079.24726: checking for any_errors_fatal 7557 1726882079.24735: done checking for any_errors_fatal 7557 1726882079.24735: checking for max_fail_percentage 7557 1726882079.24737: done checking for max_fail_percentage 7557 1726882079.24737: checking to see if all hosts have failed and the running result is not ok 7557 1726882079.24738: done checking to see if all hosts have failed 7557 1726882079.24739: getting the remaining hosts for this loop 7557 1726882079.24741: done getting the remaining hosts for this loop 7557 1726882079.24744: getting the next task for host managed_node3 7557 1726882079.24751: done getting next task for host managed_node3 7557 1726882079.24753: ^ task is: TASK: Show current_interfaces 7557 1726882079.24757: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, 
run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7557 1726882079.24760: getting variables 7557 1726882079.24761: in VariableManager get_vars() 7557 1726882079.24813: Calling all_inventory to load vars for managed_node3 7557 1726882079.24816: Calling groups_inventory to load vars for managed_node3 7557 1726882079.24819: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882079.24827: Calling all_plugins_play to load vars for managed_node3 7557 1726882079.24830: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882079.24832: Calling groups_plugins_play to load vars for managed_node3 7557 1726882079.24945: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882079.25088: done with get_vars() 7557 1726882079.25100: done getting variables 7557 1726882079.25139: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Show current_interfaces] ************************************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:5 Friday 20 September 2024 21:27:59 -0400 (0:00:00.024) 0:00:05.104 ****** 7557 1726882079.25159: entering _queue_task() for managed_node3/debug 7557 1726882079.25341: worker is 1 (out of 1 available) 7557 1726882079.25354: exiting _queue_task() for managed_node3/debug 7557 1726882079.25365: done queuing things up, now waiting for results queue to drain 7557 1726882079.25366: waiting for pending results... 
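Given that current_interfaces ends up as ['eth0', 'lo'], i.e. exactly the stdout lines of the registered command result, the 'Set current_interfaces' task at get_current_interfaces.yml:9 is presumably close to the sketch below; the exact Jinja2 expression used in the test file may differ.

- name: Set current_interfaces
  set_fact:
    current_interfaces: "{{ _current_interfaces.stdout_lines }}"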
7557 1726882079.25514: running TaskExecutor() for managed_node3/TASK: Show current_interfaces 7557 1726882079.25582: in run() - task 12673a56-9f93-ed48-b3a5-00000000057f 7557 1726882079.25605: variable 'ansible_search_path' from source: unknown 7557 1726882079.25609: variable 'ansible_search_path' from source: unknown 7557 1726882079.25629: calling self._execute() 7557 1726882079.25728: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882079.25732: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882079.25735: variable 'omit' from source: magic vars 7557 1726882079.26098: variable 'ansible_distribution_major_version' from source: facts 7557 1726882079.26101: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882079.26104: variable 'omit' from source: magic vars 7557 1726882079.26114: variable 'omit' from source: magic vars 7557 1726882079.26206: variable 'current_interfaces' from source: set_fact 7557 1726882079.26240: variable 'omit' from source: magic vars 7557 1726882079.26278: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7557 1726882079.26319: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7557 1726882079.26347: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7557 1726882079.26372: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7557 1726882079.26387: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7557 1726882079.26501: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7557 1726882079.26504: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882079.26506: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882079.26542: Set connection var ansible_module_compression to ZIP_DEFLATED 7557 1726882079.26554: Set connection var ansible_shell_executable to /bin/sh 7557 1726882079.26561: Set connection var ansible_shell_type to sh 7557 1726882079.26576: Set connection var ansible_pipelining to False 7557 1726882079.26579: Set connection var ansible_connection to ssh 7557 1726882079.26582: Set connection var ansible_timeout to 10 7557 1726882079.26603: variable 'ansible_shell_executable' from source: unknown 7557 1726882079.26606: variable 'ansible_connection' from source: unknown 7557 1726882079.26609: variable 'ansible_module_compression' from source: unknown 7557 1726882079.26611: variable 'ansible_shell_type' from source: unknown 7557 1726882079.26614: variable 'ansible_shell_executable' from source: unknown 7557 1726882079.26616: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882079.26621: variable 'ansible_pipelining' from source: unknown 7557 1726882079.26623: variable 'ansible_timeout' from source: unknown 7557 1726882079.26627: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882079.26756: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 7557 1726882079.26763: variable 'omit' 
from source: magic vars 7557 1726882079.26766: starting attempt loop 7557 1726882079.26768: running the handler 7557 1726882079.26898: handler run complete 7557 1726882079.26902: attempt loop complete, returning result 7557 1726882079.26904: _execute() done 7557 1726882079.26906: dumping result to json 7557 1726882079.26908: done dumping result, returning 7557 1726882079.26910: done running TaskExecutor() for managed_node3/TASK: Show current_interfaces [12673a56-9f93-ed48-b3a5-00000000057f] 7557 1726882079.26912: sending task result for task 12673a56-9f93-ed48-b3a5-00000000057f 7557 1726882079.26972: done sending task result for task 12673a56-9f93-ed48-b3a5-00000000057f 7557 1726882079.26975: WORKER PROCESS EXITING ok: [managed_node3] => {} MSG: current_interfaces: ['eth0', 'lo'] 7557 1726882079.27034: no more pending results, returning what we have 7557 1726882079.27037: results queue empty 7557 1726882079.27038: checking for any_errors_fatal 7557 1726882079.27042: done checking for any_errors_fatal 7557 1726882079.27042: checking for max_fail_percentage 7557 1726882079.27044: done checking for max_fail_percentage 7557 1726882079.27045: checking to see if all hosts have failed and the running result is not ok 7557 1726882079.27045: done checking to see if all hosts have failed 7557 1726882079.27046: getting the remaining hosts for this loop 7557 1726882079.27048: done getting the remaining hosts for this loop 7557 1726882079.27051: getting the next task for host managed_node3 7557 1726882079.27057: done getting next task for host managed_node3 7557 1726882079.27060: ^ task is: TASK: Install iproute 7557 1726882079.27063: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7557 1726882079.27067: getting variables 7557 1726882079.27068: in VariableManager get_vars() 7557 1726882079.27118: Calling all_inventory to load vars for managed_node3 7557 1726882079.27120: Calling groups_inventory to load vars for managed_node3 7557 1726882079.27122: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882079.27131: Calling all_plugins_play to load vars for managed_node3 7557 1726882079.27133: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882079.27136: Calling groups_plugins_play to load vars for managed_node3 7557 1726882079.27328: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882079.27549: done with get_vars() 7557 1726882079.27559: done getting variables 7557 1726882079.27625: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Install iproute] ********************************************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:16 Friday 20 September 2024 21:27:59 -0400 (0:00:00.024) 0:00:05.129 ****** 7557 1726882079.27653: entering _queue_task() for managed_node3/package 7557 1726882079.28010: worker is 1 (out of 1 available) 7557 1726882079.28028: exiting _queue_task() for managed_node3/package 7557 1726882079.28042: done queuing things up, now waiting for results queue to drain 7557 1726882079.28043: waiting for pending results... 7557 1726882079.28216: running TaskExecutor() for managed_node3/TASK: Install iproute 7557 1726882079.28280: in run() - task 12673a56-9f93-ed48-b3a5-0000000003a8 7557 1726882079.28287: variable 'ansible_search_path' from source: unknown 7557 1726882079.28294: variable 'ansible_search_path' from source: unknown 7557 1726882079.28321: calling self._execute() 7557 1726882079.28394: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882079.28399: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882079.28406: variable 'omit' from source: magic vars 7557 1726882079.28675: variable 'ansible_distribution_major_version' from source: facts 7557 1726882079.28686: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882079.28696: variable 'omit' from source: magic vars 7557 1726882079.28720: variable 'omit' from source: magic vars 7557 1726882079.28849: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 7557 1726882079.30598: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 7557 1726882079.30602: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 7557 1726882079.30604: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 7557 1726882079.30606: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 7557 1726882079.30609: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 7557 1726882079.30673: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7557 1726882079.30719: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7557 1726882079.30749: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7557 1726882079.30794: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7557 1726882079.30818: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7557 1726882079.30920: variable '__network_is_ostree' from source: set_fact 7557 1726882079.30931: variable 'omit' from source: magic vars 7557 1726882079.30952: variable 'omit' from source: magic vars 7557 1726882079.30974: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7557 1726882079.31004: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7557 1726882079.31012: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7557 1726882079.31025: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7557 1726882079.31033: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7557 1726882079.31055: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7557 1726882079.31058: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882079.31060: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882079.31137: Set connection var ansible_module_compression to ZIP_DEFLATED 7557 1726882079.31144: Set connection var ansible_shell_executable to /bin/sh 7557 1726882079.31147: Set connection var ansible_shell_type to sh 7557 1726882079.31151: Set connection var ansible_pipelining to False 7557 1726882079.31153: Set connection var ansible_connection to ssh 7557 1726882079.31158: Set connection var ansible_timeout to 10 7557 1726882079.31175: variable 'ansible_shell_executable' from source: unknown 7557 1726882079.31179: variable 'ansible_connection' from source: unknown 7557 1726882079.31182: variable 'ansible_module_compression' from source: unknown 7557 1726882079.31184: variable 'ansible_shell_type' from source: unknown 7557 1726882079.31187: variable 'ansible_shell_executable' from source: unknown 7557 1726882079.31189: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882079.31195: variable 'ansible_pipelining' from source: unknown 7557 1726882079.31197: variable 'ansible_timeout' from source: unknown 7557 1726882079.31199: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882079.31264: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 7557 1726882079.31271: variable 'omit' from source: magic vars 7557 1726882079.31274: starting attempt loop 7557 1726882079.31279: running the handler 7557 1726882079.31286: variable 'ansible_facts' from source: unknown 7557 1726882079.31289: variable 'ansible_facts' from source: unknown 7557 1726882079.31323: _low_level_execute_command(): starting 7557 1726882079.31326: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7557 1726882079.31788: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882079.31792: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882079.31797: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 <<< 7557 1726882079.31799: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882079.31851: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882079.31857: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882079.31859: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882079.31920: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882079.33817: stdout chunk (state=3): >>>/root <<< 7557 1726882079.33921: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882079.33945: stderr chunk (state=3): >>><<< 7557 1726882079.33949: stdout chunk (state=3): >>><<< 7557 1726882079.33971: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK 
debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7557 1726882079.33983: _low_level_execute_command(): starting 7557 1726882079.33986: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882079.339703-7862-136601441714869 `" && echo ansible-tmp-1726882079.339703-7862-136601441714869="` echo /root/.ansible/tmp/ansible-tmp-1726882079.339703-7862-136601441714869 `" ) && sleep 0' 7557 1726882079.34409: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882079.34412: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882079.34415: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 7557 1726882079.34418: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 <<< 7557 1726882079.34420: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882079.34466: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882079.34469: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882079.34519: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882079.36353: stdout chunk (state=3): >>>ansible-tmp-1726882079.339703-7862-136601441714869=/root/.ansible/tmp/ansible-tmp-1726882079.339703-7862-136601441714869 <<< 7557 1726882079.36451: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882079.36476: stderr chunk (state=3): >>><<< 7557 1726882079.36480: stdout chunk (state=3): >>><<< 7557 1726882079.36498: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882079.339703-7862-136601441714869=/root/.ansible/tmp/ansible-tmp-1726882079.339703-7862-136601441714869 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7557 1726882079.36517: variable 'ansible_module_compression' from source: unknown 7557 1726882079.36557: ANSIBALLZ: Using generic lock for ansible.legacy.dnf 7557 1726882079.36561: ANSIBALLZ: Acquiring lock 7557 1726882079.36564: ANSIBALLZ: Lock acquired: 140194287013904 7557 1726882079.36566: ANSIBALLZ: Creating module 7557 1726882079.45913: ANSIBALLZ: Writing module into payload 7557 1726882079.46049: ANSIBALLZ: Writing module 7557 1726882079.46071: ANSIBALLZ: Renaming module 7557 1726882079.46087: ANSIBALLZ: Done creating module 7557 1726882079.46107: variable 'ansible_facts' from source: unknown 7557 1726882079.46162: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882079.339703-7862-136601441714869/AnsiballZ_dnf.py 7557 1726882079.46269: Sending initial data 7557 1726882079.46273: Sent initial data (149 bytes) 7557 1726882079.46740: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7557 1726882079.46744: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882079.46747: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882079.46749: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882079.46806: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882079.46810: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882079.46812: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882079.46872: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882079.48417: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" 
revision 1 <<< 7557 1726882079.48478: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 7557 1726882079.48525: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-7557ap94rh2e/tmpinsytj9s /root/.ansible/tmp/ansible-tmp-1726882079.339703-7862-136601441714869/AnsiballZ_dnf.py <<< 7557 1726882079.48528: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882079.339703-7862-136601441714869/AnsiballZ_dnf.py" <<< 7557 1726882079.48612: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-7557ap94rh2e/tmpinsytj9s" to remote "/root/.ansible/tmp/ansible-tmp-1726882079.339703-7862-136601441714869/AnsiballZ_dnf.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882079.339703-7862-136601441714869/AnsiballZ_dnf.py" <<< 7557 1726882079.49423: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882079.49464: stderr chunk (state=3): >>><<< 7557 1726882079.49467: stdout chunk (state=3): >>><<< 7557 1726882079.49501: done transferring module to remote 7557 1726882079.49510: _low_level_execute_command(): starting 7557 1726882079.49514: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882079.339703-7862-136601441714869/ /root/.ansible/tmp/ansible-tmp-1726882079.339703-7862-136601441714869/AnsiballZ_dnf.py && sleep 0' 7557 1726882079.49945: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7557 1726882079.49949: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found <<< 7557 1726882079.49951: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882079.49953: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882079.49956: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882079.49999: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882079.50025: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882079.50062: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882079.51799: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882079.51825: stderr chunk (state=3): >>><<< 7557 1726882079.51828: stdout chunk (state=3): >>><<< 7557 1726882079.51845: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7557 1726882079.51931: _low_level_execute_command(): starting 7557 1726882079.51935: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882079.339703-7862-136601441714869/AnsiballZ_dnf.py && sleep 0' 7557 1726882079.52579: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882079.52600: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882079.52642: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882079.52701: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882082.70313: stdout chunk (state=3): >>> {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["iproute"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} <<< 7557 1726882082.74511: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. 
<<< 7557 1726882082.74538: stdout chunk (state=3): >>><<< 7557 1726882082.74541: stderr chunk (state=3): >>><<< 7557 1726882082.74687: _low_level_execute_command() done: rc=0, stdout= {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["iproute"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. 
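The dnf run above returned "Nothing to do": iproute was already installed on managed_node3, so the task will report ok without a change. The invocation (the 'package' action plugin resolving to ansible.legacy.dnf with name: ["iproute"], state: present) is the kind of output a task roughly like the following sketch would produce. This is a reconstruction from the log, not the literal contents of manage_test_interface.yml, and the when/until lines are only inferred from the conditionals evaluated in this run (ansible_distribution_major_version != '6' and __install_status is success).

  # Hypothetical reconstruction of the task logged at
  # tests/network/playbooks/tasks/manage_test_interface.yml:16
  - name: Install iproute
    package:
      name: iproute
      state: present
    register: __install_status
    until: __install_status is success          # inferred from the "__install_status is success" evaluation
    when: ansible_distribution_major_version != '6'   # inferred from the evaluated conditional
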
7557 1726882082.74703: done with _execute_module (ansible.legacy.dnf, {'name': 'iproute', 'state': 'present', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.dnf', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882079.339703-7862-136601441714869/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7557 1726882082.74706: _low_level_execute_command(): starting 7557 1726882082.74709: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882079.339703-7862-136601441714869/ > /dev/null 2>&1 && sleep 0' 7557 1726882082.75388: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7557 1726882082.75397: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882082.75458: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882082.75487: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882082.75517: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882082.75624: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882082.77472: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882082.77475: stdout chunk (state=3): >>><<< 7557 1726882082.77477: stderr chunk (state=3): >>><<< 7557 1726882082.77699: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7557 1726882082.77703: handler run complete 7557 1726882082.77706: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 7557 1726882082.78190: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 7557 1726882082.78343: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 7557 1726882082.78698: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 7557 1726882082.78702: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 7557 1726882082.78704: variable '__install_status' from source: unknown 7557 1726882082.78707: Evaluated conditional (__install_status is success): True 7557 1726882082.78709: attempt loop complete, returning result 7557 1726882082.78711: _execute() done 7557 1726882082.78712: dumping result to json 7557 1726882082.78714: done dumping result, returning 7557 1726882082.78716: done running TaskExecutor() for managed_node3/TASK: Install iproute [12673a56-9f93-ed48-b3a5-0000000003a8] 7557 1726882082.79052: sending task result for task 12673a56-9f93-ed48-b3a5-0000000003a8 ok: [managed_node3] => { "attempts": 1, "changed": false, "rc": 0, "results": [] } MSG: Nothing to do 7557 1726882082.79264: no more pending results, returning what we have 7557 1726882082.79267: results queue empty 7557 1726882082.79268: checking for any_errors_fatal 7557 1726882082.79272: done checking for any_errors_fatal 7557 1726882082.79272: checking for max_fail_percentage 7557 1726882082.79274: done checking for max_fail_percentage 7557 1726882082.79275: checking to see if all hosts have failed and the running result is not ok 7557 1726882082.79276: done checking to see if all hosts have failed 7557 1726882082.79276: getting the remaining hosts for this loop 7557 1726882082.79278: done getting the remaining hosts for this loop 7557 1726882082.79280: getting the next task for host managed_node3 7557 1726882082.79287: done getting next task for host managed_node3 7557 1726882082.79289: ^ task is: TASK: Create veth interface {{ interface }} 7557 1726882082.79292: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7557 1726882082.79297: getting variables 7557 1726882082.79299: in VariableManager get_vars() 7557 1726882082.79345: Calling all_inventory to load vars for managed_node3 7557 1726882082.79347: Calling groups_inventory to load vars for managed_node3 7557 1726882082.79350: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882082.79360: Calling all_plugins_play to load vars for managed_node3 7557 1726882082.79362: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882082.79365: Calling groups_plugins_play to load vars for managed_node3 7557 1726882082.79570: done sending task result for task 12673a56-9f93-ed48-b3a5-0000000003a8 7557 1726882082.79573: WORKER PROCESS EXITING 7557 1726882082.79583: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882082.79705: done with get_vars() 7557 1726882082.79714: done getting variables 7557 1726882082.79756: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 7557 1726882082.79852: variable 'interface' from source: play vars TASK [Create veth interface veth0] ********************************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:27 Friday 20 September 2024 21:28:02 -0400 (0:00:03.522) 0:00:08.651 ****** 7557 1726882082.79883: entering _queue_task() for managed_node3/command 7557 1726882082.80064: worker is 1 (out of 1 available) 7557 1726882082.80077: exiting _queue_task() for managed_node3/command 7557 1726882082.80091: done queuing things up, now waiting for results queue to drain 7557 1726882082.80092: waiting for pending results... 
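The task just queued templates its name from the play variable interface (veth0) and, as the lookup 'items' plugin load below shows, iterates over ip commands run through the command module. A minimal sketch under those assumptions follows; the loop item text and the conditional (type == 'veth' and state == 'present' and interface not in current_interfaces) are taken from this log, while the exact structure and any additional loop items in the real manage_test_interface.yml task are assumed.

  # Hypothetical sketch of the looping command task; only the item executed
  # below is shown, and the real task may carry more items or options.
  - name: Create veth interface {{ interface }}
    command: "{{ item }}"
    with_items:
      - ip link add {{ interface }} type veth peer name peer{{ interface }}
    when: type == 'veth' and state == 'present' and interface not in current_interfaces
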
7557 1726882082.80248: running TaskExecutor() for managed_node3/TASK: Create veth interface veth0 7557 1726882082.80322: in run() - task 12673a56-9f93-ed48-b3a5-0000000003a9 7557 1726882082.80331: variable 'ansible_search_path' from source: unknown 7557 1726882082.80335: variable 'ansible_search_path' from source: unknown 7557 1726882082.80538: variable 'interface' from source: play vars 7557 1726882082.80598: variable 'interface' from source: play vars 7557 1726882082.80649: variable 'interface' from source: play vars 7557 1726882082.80755: Loaded config def from plugin (lookup/items) 7557 1726882082.80761: Loading LookupModule 'items' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/items.py 7557 1726882082.80777: variable 'omit' from source: magic vars 7557 1726882082.80864: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882082.80867: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882082.80878: variable 'omit' from source: magic vars 7557 1726882082.81047: variable 'ansible_distribution_major_version' from source: facts 7557 1726882082.81052: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882082.81398: variable 'type' from source: play vars 7557 1726882082.81401: variable 'state' from source: include params 7557 1726882082.81403: variable 'interface' from source: play vars 7557 1726882082.81406: variable 'current_interfaces' from source: set_fact 7557 1726882082.81408: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): True 7557 1726882082.81410: variable 'omit' from source: magic vars 7557 1726882082.81412: variable 'omit' from source: magic vars 7557 1726882082.81413: variable 'item' from source: unknown 7557 1726882082.81438: variable 'item' from source: unknown 7557 1726882082.81457: variable 'omit' from source: magic vars 7557 1726882082.81490: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7557 1726882082.81531: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7557 1726882082.81553: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7557 1726882082.81577: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7557 1726882082.81598: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7557 1726882082.81634: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7557 1726882082.81642: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882082.81649: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882082.81755: Set connection var ansible_module_compression to ZIP_DEFLATED 7557 1726882082.81770: Set connection var ansible_shell_executable to /bin/sh 7557 1726882082.81778: Set connection var ansible_shell_type to sh 7557 1726882082.81788: Set connection var ansible_pipelining to False 7557 1726882082.81801: Set connection var ansible_connection to ssh 7557 1726882082.81812: Set connection var ansible_timeout to 10 7557 1726882082.81835: variable 'ansible_shell_executable' from source: unknown 7557 1726882082.81843: variable 'ansible_connection' from source: unknown 7557 1726882082.81851: variable 'ansible_module_compression' from 
source: unknown 7557 1726882082.81857: variable 'ansible_shell_type' from source: unknown 7557 1726882082.81863: variable 'ansible_shell_executable' from source: unknown 7557 1726882082.81869: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882082.81876: variable 'ansible_pipelining' from source: unknown 7557 1726882082.81882: variable 'ansible_timeout' from source: unknown 7557 1726882082.81889: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882082.82021: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 7557 1726882082.82036: variable 'omit' from source: magic vars 7557 1726882082.82045: starting attempt loop 7557 1726882082.82052: running the handler 7557 1726882082.82070: _low_level_execute_command(): starting 7557 1726882082.82197: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7557 1726882082.82672: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882082.82690: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7557 1726882082.82704: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882082.82741: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882082.82768: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882082.82815: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882082.84410: stdout chunk (state=3): >>>/root <<< 7557 1726882082.84577: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882082.84581: stdout chunk (state=3): >>><<< 7557 1726882082.84584: stderr chunk (state=3): >>><<< 7557 1726882082.84587: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7557 1726882082.84605: _low_level_execute_command(): starting 7557 1726882082.84609: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882082.845841-7963-261446320666901 `" && echo ansible-tmp-1726882082.845841-7963-261446320666901="` echo /root/.ansible/tmp/ansible-tmp-1726882082.845841-7963-261446320666901 `" ) && sleep 0' 7557 1726882082.85199: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 <<< 7557 1726882082.85203: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882082.85265: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882082.85279: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882082.85306: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882082.85385: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882082.87253: stdout chunk (state=3): >>>ansible-tmp-1726882082.845841-7963-261446320666901=/root/.ansible/tmp/ansible-tmp-1726882082.845841-7963-261446320666901 <<< 7557 1726882082.87410: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882082.87414: stdout chunk (state=3): >>><<< 7557 1726882082.87416: stderr chunk (state=3): >>><<< 7557 1726882082.87430: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882082.845841-7963-261446320666901=/root/.ansible/tmp/ansible-tmp-1726882082.845841-7963-261446320666901 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address 
debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7557 1726882082.87465: variable 'ansible_module_compression' from source: unknown 7557 1726882082.87580: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-7557ap94rh2e/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 7557 1726882082.87583: variable 'ansible_facts' from source: unknown 7557 1726882082.87659: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882082.845841-7963-261446320666901/AnsiballZ_command.py 7557 1726882082.87865: Sending initial data 7557 1726882082.87876: Sent initial data (153 bytes) 7557 1726882082.88368: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882082.88384: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882082.88397: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882082.88449: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882082.88452: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882082.88507: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882082.90058: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 7557 1726882082.90062: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 7557 1726882082.90104: stderr 
chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 7557 1726882082.90154: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-7557ap94rh2e/tmp7k4ejw2b /root/.ansible/tmp/ansible-tmp-1726882082.845841-7963-261446320666901/AnsiballZ_command.py <<< 7557 1726882082.90159: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882082.845841-7963-261446320666901/AnsiballZ_command.py" <<< 7557 1726882082.90203: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-7557ap94rh2e/tmp7k4ejw2b" to remote "/root/.ansible/tmp/ansible-tmp-1726882082.845841-7963-261446320666901/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882082.845841-7963-261446320666901/AnsiballZ_command.py" <<< 7557 1726882082.90753: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882082.90796: stderr chunk (state=3): >>><<< 7557 1726882082.90800: stdout chunk (state=3): >>><<< 7557 1726882082.90845: done transferring module to remote 7557 1726882082.90853: _low_level_execute_command(): starting 7557 1726882082.90858: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882082.845841-7963-261446320666901/ /root/.ansible/tmp/ansible-tmp-1726882082.845841-7963-261446320666901/AnsiballZ_command.py && sleep 0' 7557 1726882082.91282: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7557 1726882082.91288: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found <<< 7557 1726882082.91290: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 7557 1726882082.91300: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7557 1726882082.91303: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882082.91349: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882082.91354: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882082.91398: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882082.93096: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882082.93121: stderr chunk (state=3): >>><<< 7557 1726882082.93124: stdout chunk (state=3): >>><<< 7557 1726882082.93136: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7557 1726882082.93138: _low_level_execute_command(): starting 7557 1726882082.93143: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882082.845841-7963-261446320666901/AnsiballZ_command.py && sleep 0' 7557 1726882082.93541: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882082.93545: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882082.93556: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882082.93616: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882082.93623: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882082.93669: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882083.13688: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "add", "veth0", "type", "veth", "peer", "name", "peerveth0"], "start": "2024-09-20 21:28:03.086403", "end": "2024-09-20 21:28:03.134331", "delta": "0:00:00.047928", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link add veth0 type veth peer name peerveth0", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 7557 1726882083.16271: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. 
<<< 7557 1726882083.16286: stderr chunk (state=3): >>><<< 7557 1726882083.16289: stdout chunk (state=3): >>><<< 7557 1726882083.16312: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "add", "veth0", "type", "veth", "peer", "name", "peerveth0"], "start": "2024-09-20 21:28:03.086403", "end": "2024-09-20 21:28:03.134331", "delta": "0:00:00.047928", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link add veth0 type veth peer name peerveth0", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. 
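The command completed with rc=0 in roughly 0.048 s, so the veth pair veth0/peerveth0 now exists on managed_node3 (the loop item result is printed a little further down). A check task like the sketch below, which is illustrative only and was not part of this run, is one way to confirm the link is visible on the target:

  # Illustrative verification task (hypothetical, not executed in this log)
  - name: Verify veth0 was created
    command: ip -brief link show veth0
    register: veth0_link
    changed_when: false
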
7557 1726882083.16338: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link add veth0 type veth peer name peerveth0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882082.845841-7963-261446320666901/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7557 1726882083.16345: _low_level_execute_command(): starting 7557 1726882083.16349: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882082.845841-7963-261446320666901/ > /dev/null 2>&1 && sleep 0' 7557 1726882083.16767: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882083.16775: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882083.16777: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 7557 1726882083.16779: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found <<< 7557 1726882083.16782: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882083.16822: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882083.16825: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882083.16883: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882083.20974: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882083.20999: stderr chunk (state=3): >>><<< 7557 1726882083.21002: stdout chunk (state=3): >>><<< 7557 1726882083.21013: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration 
data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7557 1726882083.21019: handler run complete 7557 1726882083.21036: Evaluated conditional (False): False 7557 1726882083.21044: attempt loop complete, returning result 7557 1726882083.21060: variable 'item' from source: unknown 7557 1726882083.21121: variable 'item' from source: unknown ok: [managed_node3] => (item=ip link add veth0 type veth peer name peerveth0) => { "ansible_loop_var": "item", "changed": false, "cmd": [ "ip", "link", "add", "veth0", "type", "veth", "peer", "name", "peerveth0" ], "delta": "0:00:00.047928", "end": "2024-09-20 21:28:03.134331", "item": "ip link add veth0 type veth peer name peerveth0", "rc": 0, "start": "2024-09-20 21:28:03.086403" } 7557 1726882083.21286: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882083.21289: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882083.21295: variable 'omit' from source: magic vars 7557 1726882083.21382: variable 'ansible_distribution_major_version' from source: facts 7557 1726882083.21386: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882083.21505: variable 'type' from source: play vars 7557 1726882083.21508: variable 'state' from source: include params 7557 1726882083.21511: variable 'interface' from source: play vars 7557 1726882083.21521: variable 'current_interfaces' from source: set_fact 7557 1726882083.21524: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): True 7557 1726882083.21527: variable 'omit' from source: magic vars 7557 1726882083.21539: variable 'omit' from source: magic vars 7557 1726882083.21564: variable 'item' from source: unknown 7557 1726882083.21609: variable 'item' from source: unknown 7557 1726882083.21623: variable 'omit' from source: magic vars 7557 1726882083.21640: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7557 1726882083.21649: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7557 1726882083.21655: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7557 1726882083.21665: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7557 1726882083.21668: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882083.21671: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882083.21725: Set connection var ansible_module_compression to ZIP_DEFLATED 7557 1726882083.21730: Set connection var ansible_shell_executable to /bin/sh 7557 1726882083.21733: Set connection var ansible_shell_type to sh 7557 1726882083.21742: Set connection var ansible_pipelining to False 7557 1726882083.21745: Set connection var ansible_connection to ssh 7557 1726882083.21748: Set connection var 
ansible_timeout to 10 7557 1726882083.21763: variable 'ansible_shell_executable' from source: unknown 7557 1726882083.21766: variable 'ansible_connection' from source: unknown 7557 1726882083.21768: variable 'ansible_module_compression' from source: unknown 7557 1726882083.21771: variable 'ansible_shell_type' from source: unknown 7557 1726882083.21773: variable 'ansible_shell_executable' from source: unknown 7557 1726882083.21775: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882083.21779: variable 'ansible_pipelining' from source: unknown 7557 1726882083.21781: variable 'ansible_timeout' from source: unknown 7557 1726882083.21785: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882083.21852: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 7557 1726882083.21862: variable 'omit' from source: magic vars 7557 1726882083.21864: starting attempt loop 7557 1726882083.21867: running the handler 7557 1726882083.21873: _low_level_execute_command(): starting 7557 1726882083.21876: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7557 1726882083.22301: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882083.22305: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882083.22307: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882083.22313: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882083.22356: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882083.22359: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882083.22413: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882083.23971: stdout chunk (state=3): >>>/root <<< 7557 1726882083.24070: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882083.24095: stderr chunk (state=3): >>><<< 7557 1726882083.24099: stdout chunk (state=3): >>><<< 7557 1726882083.24109: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: 
match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7557 1726882083.24117: _low_level_execute_command(): starting 7557 1726882083.24121: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882083.241091-7963-64782371227672 `" && echo ansible-tmp-1726882083.241091-7963-64782371227672="` echo /root/.ansible/tmp/ansible-tmp-1726882083.241091-7963-64782371227672 `" ) && sleep 0' 7557 1726882083.24530: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7557 1726882083.24533: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7557 1726882083.24536: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882083.24538: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 7557 1726882083.24540: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7557 1726882083.24542: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882083.24587: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882083.24599: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882083.24641: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882083.26474: stdout chunk (state=3): >>>ansible-tmp-1726882083.241091-7963-64782371227672=/root/.ansible/tmp/ansible-tmp-1726882083.241091-7963-64782371227672 <<< 7557 1726882083.26577: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882083.26606: stderr chunk (state=3): >>><<< 7557 1726882083.26610: stdout chunk (state=3): >>><<< 7557 1726882083.26625: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882083.241091-7963-64782371227672=/root/.ansible/tmp/ansible-tmp-1726882083.241091-7963-64782371227672 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration 
data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7557 1726882083.26650: variable 'ansible_module_compression' from source: unknown 7557 1726882083.26680: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-7557ap94rh2e/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 7557 1726882083.26700: variable 'ansible_facts' from source: unknown 7557 1726882083.26743: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882083.241091-7963-64782371227672/AnsiballZ_command.py 7557 1726882083.26838: Sending initial data 7557 1726882083.26841: Sent initial data (152 bytes) 7557 1726882083.27300: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7557 1726882083.27303: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found <<< 7557 1726882083.27306: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882083.27308: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882083.27310: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882083.27362: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882083.27366: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882083.27370: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882083.27415: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882083.28909: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: 
Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 7557 1726882083.28953: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 7557 1726882083.28998: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-7557ap94rh2e/tmp1q56w0mg /root/.ansible/tmp/ansible-tmp-1726882083.241091-7963-64782371227672/AnsiballZ_command.py <<< 7557 1726882083.29005: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882083.241091-7963-64782371227672/AnsiballZ_command.py" <<< 7557 1726882083.29048: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-7557ap94rh2e/tmp1q56w0mg" to remote "/root/.ansible/tmp/ansible-tmp-1726882083.241091-7963-64782371227672/AnsiballZ_command.py" <<< 7557 1726882083.29050: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882083.241091-7963-64782371227672/AnsiballZ_command.py" <<< 7557 1726882083.29577: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882083.29624: stderr chunk (state=3): >>><<< 7557 1726882083.29628: stdout chunk (state=3): >>><<< 7557 1726882083.29654: done transferring module to remote 7557 1726882083.29661: _low_level_execute_command(): starting 7557 1726882083.29666: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882083.241091-7963-64782371227672/ /root/.ansible/tmp/ansible-tmp-1726882083.241091-7963-64782371227672/AnsiballZ_command.py && sleep 0' 7557 1726882083.30117: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7557 1726882083.30120: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found <<< 7557 1726882083.30122: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882083.30125: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 7557 1726882083.30127: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found <<< 7557 1726882083.30131: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882083.30179: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882083.30183: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882083.30187: stderr chunk 
(state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882083.30229: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882083.31908: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882083.31932: stderr chunk (state=3): >>><<< 7557 1726882083.31935: stdout chunk (state=3): >>><<< 7557 1726882083.31951: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7557 1726882083.31954: _low_level_execute_command(): starting 7557 1726882083.31958: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882083.241091-7963-64782371227672/AnsiballZ_command.py && sleep 0' 7557 1726882083.32385: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882083.32388: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found <<< 7557 1726882083.32390: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882083.32397: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882083.32400: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882083.32448: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882083.32451: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882083.32511: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882083.48051: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", 
"rc": 0, "cmd": ["ip", "link", "set", "peerveth0", "up"], "start": "2024-09-20 21:28:03.475176", "end": "2024-09-20 21:28:03.478873", "delta": "0:00:00.003697", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set peerveth0 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 7557 1726882083.49559: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. <<< 7557 1726882083.49587: stderr chunk (state=3): >>><<< 7557 1726882083.49595: stdout chunk (state=3): >>><<< 7557 1726882083.49610: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "peerveth0", "up"], "start": "2024-09-20 21:28:03.475176", "end": "2024-09-20 21:28:03.478873", "delta": "0:00:00.003697", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set peerveth0 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. 
7557 1726882083.49636: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link set peerveth0 up', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882083.241091-7963-64782371227672/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7557 1726882083.49642: _low_level_execute_command(): starting 7557 1726882083.49647: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882083.241091-7963-64782371227672/ > /dev/null 2>&1 && sleep 0' 7557 1726882083.50104: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882083.50107: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882083.50111: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address <<< 7557 1726882083.50114: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882083.50116: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882083.50167: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882083.50171: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882083.50173: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882083.50225: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882083.52016: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882083.52044: stderr chunk (state=3): >>><<< 7557 1726882083.52048: stdout chunk (state=3): >>><<< 7557 1726882083.52067: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7557 1726882083.52072: handler run complete 7557 1726882083.52087: Evaluated conditional (False): False 7557 1726882083.52101: attempt loop complete, returning result 7557 1726882083.52117: variable 'item' from source: unknown 7557 1726882083.52180: variable 'item' from source: unknown ok: [managed_node3] => (item=ip link set peerveth0 up) => { "ansible_loop_var": "item", "changed": false, "cmd": [ "ip", "link", "set", "peerveth0", "up" ], "delta": "0:00:00.003697", "end": "2024-09-20 21:28:03.478873", "item": "ip link set peerveth0 up", "rc": 0, "start": "2024-09-20 21:28:03.475176" } 7557 1726882083.52306: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882083.52309: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882083.52311: variable 'omit' from source: magic vars 7557 1726882083.52403: variable 'ansible_distribution_major_version' from source: facts 7557 1726882083.52407: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882083.52527: variable 'type' from source: play vars 7557 1726882083.52533: variable 'state' from source: include params 7557 1726882083.52535: variable 'interface' from source: play vars 7557 1726882083.52538: variable 'current_interfaces' from source: set_fact 7557 1726882083.52548: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): True 7557 1726882083.52550: variable 'omit' from source: magic vars 7557 1726882083.52560: variable 'omit' from source: magic vars 7557 1726882083.52584: variable 'item' from source: unknown 7557 1726882083.52630: variable 'item' from source: unknown 7557 1726882083.52641: variable 'omit' from source: magic vars 7557 1726882083.52659: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7557 1726882083.52666: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7557 1726882083.52673: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7557 1726882083.52682: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7557 1726882083.52684: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882083.52687: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882083.52739: Set connection var ansible_module_compression to ZIP_DEFLATED 7557 1726882083.52744: Set connection var ansible_shell_executable to /bin/sh 7557 1726882083.52747: Set connection var ansible_shell_type to sh 7557 1726882083.52752: Set connection var ansible_pipelining to False 7557 1726882083.52754: Set connection var ansible_connection to ssh 7557 1726882083.52758: Set connection var ansible_timeout to 10 7557 1726882083.52776: variable 
'ansible_shell_executable' from source: unknown 7557 1726882083.52779: variable 'ansible_connection' from source: unknown 7557 1726882083.52781: variable 'ansible_module_compression' from source: unknown 7557 1726882083.52783: variable 'ansible_shell_type' from source: unknown 7557 1726882083.52786: variable 'ansible_shell_executable' from source: unknown 7557 1726882083.52788: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882083.52790: variable 'ansible_pipelining' from source: unknown 7557 1726882083.52796: variable 'ansible_timeout' from source: unknown 7557 1726882083.52798: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882083.52860: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 7557 1726882083.52868: variable 'omit' from source: magic vars 7557 1726882083.52871: starting attempt loop 7557 1726882083.52873: running the handler 7557 1726882083.52883: _low_level_execute_command(): starting 7557 1726882083.52885: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7557 1726882083.53344: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882083.53347: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882083.53350: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 7557 1726882083.53352: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found <<< 7557 1726882083.53354: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882083.53405: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882083.53408: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882083.53411: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882083.53463: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882083.54998: stdout chunk (state=3): >>>/root <<< 7557 1726882083.55097: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882083.55123: stderr chunk (state=3): >>><<< 7557 1726882083.55126: stdout chunk (state=3): >>><<< 7557 1726882083.55138: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7557 1726882083.55146: _low_level_execute_command(): starting 7557 1726882083.55151: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882083.5513813-7963-84347085639172 `" && echo ansible-tmp-1726882083.5513813-7963-84347085639172="` echo /root/.ansible/tmp/ansible-tmp-1726882083.5513813-7963-84347085639172 `" ) && sleep 0' 7557 1726882083.55589: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7557 1726882083.55597: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found <<< 7557 1726882083.55600: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882083.55602: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882083.55604: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882083.55655: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882083.55662: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882083.55664: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882083.55710: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882083.57544: stdout chunk (state=3): >>>ansible-tmp-1726882083.5513813-7963-84347085639172=/root/.ansible/tmp/ansible-tmp-1726882083.5513813-7963-84347085639172 <<< 7557 1726882083.57647: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882083.57678: stderr chunk (state=3): >>><<< 7557 1726882083.57681: stdout chunk (state=3): >>><<< 7557 1726882083.57697: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882083.5513813-7963-84347085639172=/root/.ansible/tmp/ansible-tmp-1726882083.5513813-7963-84347085639172 , 
stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7557 1726882083.57714: variable 'ansible_module_compression' from source: unknown 7557 1726882083.57742: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-7557ap94rh2e/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 7557 1726882083.57757: variable 'ansible_facts' from source: unknown 7557 1726882083.57806: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882083.5513813-7963-84347085639172/AnsiballZ_command.py 7557 1726882083.57898: Sending initial data 7557 1726882083.57901: Sent initial data (153 bytes) 7557 1726882083.58351: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882083.58354: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882083.58356: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address <<< 7557 1726882083.58358: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882083.58360: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882083.58409: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882083.58415: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882083.58465: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882083.59986: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension 
"fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 7557 1726882083.60029: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 7557 1726882083.60080: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-7557ap94rh2e/tmp5kvv_saw /root/.ansible/tmp/ansible-tmp-1726882083.5513813-7963-84347085639172/AnsiballZ_command.py <<< 7557 1726882083.60083: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882083.5513813-7963-84347085639172/AnsiballZ_command.py" <<< 7557 1726882083.60125: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-7557ap94rh2e/tmp5kvv_saw" to remote "/root/.ansible/tmp/ansible-tmp-1726882083.5513813-7963-84347085639172/AnsiballZ_command.py" <<< 7557 1726882083.60129: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882083.5513813-7963-84347085639172/AnsiballZ_command.py" <<< 7557 1726882083.60676: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882083.60723: stderr chunk (state=3): >>><<< 7557 1726882083.60727: stdout chunk (state=3): >>><<< 7557 1726882083.60753: done transferring module to remote 7557 1726882083.60760: _low_level_execute_command(): starting 7557 1726882083.60764: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882083.5513813-7963-84347085639172/ /root/.ansible/tmp/ansible-tmp-1726882083.5513813-7963-84347085639172/AnsiballZ_command.py && sleep 0' 7557 1726882083.61221: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7557 1726882083.61224: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 <<< 7557 1726882083.61227: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882083.61229: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address <<< 7557 1726882083.61231: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882083.61233: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882083.61285: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882083.61290: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK 
<<< 7557 1726882083.61296: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882083.61335: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882083.63016: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882083.63043: stderr chunk (state=3): >>><<< 7557 1726882083.63046: stdout chunk (state=3): >>><<< 7557 1726882083.63059: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7557 1726882083.63062: _low_level_execute_command(): starting 7557 1726882083.63067: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882083.5513813-7963-84347085639172/AnsiballZ_command.py && sleep 0' 7557 1726882083.63516: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7557 1726882083.63519: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882083.63525: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 7557 1726882083.63528: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found <<< 7557 1726882083.63530: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882083.63583: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882083.63586: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882083.63631: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882083.79065: stdout chunk (state=3): >>> 
{"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "veth0", "up"], "start": "2024-09-20 21:28:03.784976", "end": "2024-09-20 21:28:03.788711", "delta": "0:00:00.003735", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set veth0 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 7557 1726882083.80579: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. <<< 7557 1726882083.80583: stdout chunk (state=3): >>><<< 7557 1726882083.80586: stderr chunk (state=3): >>><<< 7557 1726882083.80609: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "veth0", "up"], "start": "2024-09-20 21:28:03.784976", "end": "2024-09-20 21:28:03.788711", "delta": "0:00:00.003735", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set veth0 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. 
7557 1726882083.80730: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link set veth0 up', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882083.5513813-7963-84347085639172/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7557 1726882083.80734: _low_level_execute_command(): starting 7557 1726882083.80736: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882083.5513813-7963-84347085639172/ > /dev/null 2>&1 && sleep 0' 7557 1726882083.81300: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7557 1726882083.81316: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7557 1726882083.81415: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882083.81443: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882083.81459: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882083.81481: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882083.81704: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882083.83481: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882083.83507: stderr chunk (state=3): >>><<< 7557 1726882083.83511: stdout chunk (state=3): >>><<< 7557 1726882083.83527: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7557 1726882083.83530: handler run complete 7557 1726882083.83546: Evaluated conditional (False): False 7557 1726882083.83553: attempt loop complete, returning result 7557 1726882083.83568: variable 'item' from source: unknown 7557 1726882083.83634: variable 'item' from source: unknown ok: [managed_node3] => (item=ip link set veth0 up) => { "ansible_loop_var": "item", "changed": false, "cmd": [ "ip", "link", "set", "veth0", "up" ], "delta": "0:00:00.003735", "end": "2024-09-20 21:28:03.788711", "item": "ip link set veth0 up", "rc": 0, "start": "2024-09-20 21:28:03.784976" } 7557 1726882083.83749: dumping result to json 7557 1726882083.83752: done dumping result, returning 7557 1726882083.83753: done running TaskExecutor() for managed_node3/TASK: Create veth interface veth0 [12673a56-9f93-ed48-b3a5-0000000003a9] 7557 1726882083.83756: sending task result for task 12673a56-9f93-ed48-b3a5-0000000003a9 7557 1726882083.83800: done sending task result for task 12673a56-9f93-ed48-b3a5-0000000003a9 7557 1726882083.83802: WORKER PROCESS EXITING 7557 1726882083.83857: no more pending results, returning what we have 7557 1726882083.83861: results queue empty 7557 1726882083.83861: checking for any_errors_fatal 7557 1726882083.83868: done checking for any_errors_fatal 7557 1726882083.83869: checking for max_fail_percentage 7557 1726882083.83870: done checking for max_fail_percentage 7557 1726882083.83871: checking to see if all hosts have failed and the running result is not ok 7557 1726882083.83872: done checking to see if all hosts have failed 7557 1726882083.83872: getting the remaining hosts for this loop 7557 1726882083.83874: done getting the remaining hosts for this loop 7557 1726882083.83876: getting the next task for host managed_node3 7557 1726882083.83882: done getting next task for host managed_node3 7557 1726882083.83885: ^ task is: TASK: Set up veth as managed by NetworkManager 7557 1726882083.83888: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7557 1726882083.83892: getting variables 7557 1726882083.83895: in VariableManager get_vars() 7557 1726882083.83943: Calling all_inventory to load vars for managed_node3 7557 1726882083.83945: Calling groups_inventory to load vars for managed_node3 7557 1726882083.83947: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882083.83957: Calling all_plugins_play to load vars for managed_node3 7557 1726882083.83959: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882083.83961: Calling groups_plugins_play to load vars for managed_node3 7557 1726882083.84128: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882083.84249: done with get_vars() 7557 1726882083.84257: done getting variables 7557 1726882083.84301: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set up veth as managed by NetworkManager] ******************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:35 Friday 20 September 2024 21:28:03 -0400 (0:00:01.044) 0:00:09.696 ****** 7557 1726882083.84321: entering _queue_task() for managed_node3/command 7557 1726882083.84513: worker is 1 (out of 1 available) 7557 1726882083.84525: exiting _queue_task() for managed_node3/command 7557 1726882083.84538: done queuing things up, now waiting for results queue to drain 7557 1726882083.84539: waiting for pending results... 7557 1726882083.84694: running TaskExecutor() for managed_node3/TASK: Set up veth as managed by NetworkManager 7557 1726882083.84757: in run() - task 12673a56-9f93-ed48-b3a5-0000000003aa 7557 1726882083.84770: variable 'ansible_search_path' from source: unknown 7557 1726882083.84774: variable 'ansible_search_path' from source: unknown 7557 1726882083.84807: calling self._execute() 7557 1726882083.84876: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882083.84879: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882083.84888: variable 'omit' from source: magic vars 7557 1726882083.85145: variable 'ansible_distribution_major_version' from source: facts 7557 1726882083.85154: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882083.85262: variable 'type' from source: play vars 7557 1726882083.85265: variable 'state' from source: include params 7557 1726882083.85272: Evaluated conditional (type == 'veth' and state == 'present'): True 7557 1726882083.85277: variable 'omit' from source: magic vars 7557 1726882083.85309: variable 'omit' from source: magic vars 7557 1726882083.85374: variable 'interface' from source: play vars 7557 1726882083.85387: variable 'omit' from source: magic vars 7557 1726882083.85423: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7557 1726882083.85453: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7557 1726882083.85467: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7557 1726882083.85480: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py 
(found_in_cache=True, class_only=False) 7557 1726882083.85491: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7557 1726882083.85517: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7557 1726882083.85522: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882083.85524: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882083.85595: Set connection var ansible_module_compression to ZIP_DEFLATED 7557 1726882083.85603: Set connection var ansible_shell_executable to /bin/sh 7557 1726882083.85607: Set connection var ansible_shell_type to sh 7557 1726882083.85611: Set connection var ansible_pipelining to False 7557 1726882083.85614: Set connection var ansible_connection to ssh 7557 1726882083.85620: Set connection var ansible_timeout to 10 7557 1726882083.85639: variable 'ansible_shell_executable' from source: unknown 7557 1726882083.85642: variable 'ansible_connection' from source: unknown 7557 1726882083.85646: variable 'ansible_module_compression' from source: unknown 7557 1726882083.85648: variable 'ansible_shell_type' from source: unknown 7557 1726882083.85651: variable 'ansible_shell_executable' from source: unknown 7557 1726882083.85653: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882083.85656: variable 'ansible_pipelining' from source: unknown 7557 1726882083.85658: variable 'ansible_timeout' from source: unknown 7557 1726882083.85660: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882083.85757: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 7557 1726882083.85765: variable 'omit' from source: magic vars 7557 1726882083.85771: starting attempt loop 7557 1726882083.85774: running the handler 7557 1726882083.85787: _low_level_execute_command(): starting 7557 1726882083.85795: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7557 1726882083.86271: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882083.86309: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found <<< 7557 1726882083.86313: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882083.86316: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882083.86319: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882083.86350: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master 
at '/root/.ansible/cp/537759ca41' <<< 7557 1726882083.86364: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882083.86424: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882083.87999: stdout chunk (state=3): >>>/root <<< 7557 1726882083.88101: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882083.88128: stderr chunk (state=3): >>><<< 7557 1726882083.88131: stdout chunk (state=3): >>><<< 7557 1726882083.88154: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7557 1726882083.88164: _low_level_execute_command(): starting 7557 1726882083.88169: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882083.8815267-8017-213529888624154 `" && echo ansible-tmp-1726882083.8815267-8017-213529888624154="` echo /root/.ansible/tmp/ansible-tmp-1726882083.8815267-8017-213529888624154 `" ) && sleep 0' 7557 1726882083.88611: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7557 1726882083.88614: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882083.88624: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 7557 1726882083.88626: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882083.88628: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882083.88668: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' 
<<< 7557 1726882083.88671: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882083.88722: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882083.90583: stdout chunk (state=3): >>>ansible-tmp-1726882083.8815267-8017-213529888624154=/root/.ansible/tmp/ansible-tmp-1726882083.8815267-8017-213529888624154 <<< 7557 1726882083.90689: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882083.90717: stderr chunk (state=3): >>><<< 7557 1726882083.90720: stdout chunk (state=3): >>><<< 7557 1726882083.90735: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882083.8815267-8017-213529888624154=/root/.ansible/tmp/ansible-tmp-1726882083.8815267-8017-213529888624154 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7557 1726882083.90762: variable 'ansible_module_compression' from source: unknown 7557 1726882083.90803: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-7557ap94rh2e/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 7557 1726882083.90834: variable 'ansible_facts' from source: unknown 7557 1726882083.90890: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882083.8815267-8017-213529888624154/AnsiballZ_command.py 7557 1726882083.90998: Sending initial data 7557 1726882083.91005: Sent initial data (154 bytes) 7557 1726882083.91450: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7557 1726882083.91454: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found <<< 7557 1726882083.91456: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882083.91458: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882083.91460: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882083.91516: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882083.91523: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882083.91526: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882083.91568: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882083.93071: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 <<< 7557 1726882083.93075: stderr chunk (state=3): >>>debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 7557 1726882083.93120: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 7557 1726882083.93172: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-7557ap94rh2e/tmpp1kbxkvx /root/.ansible/tmp/ansible-tmp-1726882083.8815267-8017-213529888624154/AnsiballZ_command.py <<< 7557 1726882083.93174: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882083.8815267-8017-213529888624154/AnsiballZ_command.py" <<< 7557 1726882083.93214: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-7557ap94rh2e/tmpp1kbxkvx" to remote "/root/.ansible/tmp/ansible-tmp-1726882083.8815267-8017-213529888624154/AnsiballZ_command.py" <<< 7557 1726882083.93217: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882083.8815267-8017-213529888624154/AnsiballZ_command.py" <<< 7557 1726882083.93754: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882083.93789: stderr chunk (state=3): >>><<< 7557 1726882083.93792: stdout chunk (state=3): >>><<< 7557 1726882083.93826: done transferring module to remote 7557 1726882083.93835: _low_level_execute_command(): starting 7557 1726882083.93840: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882083.8815267-8017-213529888624154/ /root/.ansible/tmp/ansible-tmp-1726882083.8815267-8017-213529888624154/AnsiballZ_command.py && sleep 0' 7557 1726882083.94261: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7557 1726882083.94265: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found <<< 7557 1726882083.94267: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 7557 1726882083.94272: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found <<< 7557 1726882083.94274: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882083.94319: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882083.94322: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882083.94372: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882083.96157: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882083.96161: stdout chunk (state=3): >>><<< 7557 1726882083.96163: stderr chunk (state=3): >>><<< 7557 1726882083.96178: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7557 1726882083.96187: _low_level_execute_command(): starting 7557 1726882083.96268: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882083.8815267-8017-213529888624154/AnsiballZ_command.py && sleep 0' 7557 1726882083.96819: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7557 1726882083.96844: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7557 1726882083.96860: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882083.96877: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7557 1726882083.96900: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 <<< 7557 1726882083.96914: stderr chunk (state=3): >>>debug2: match not found <<< 7557 1726882083.96936: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 
1726882083.97010: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882083.97053: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882083.97069: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882083.97091: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882083.97186: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882084.19681: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["nmcli", "d", "set", "veth0", "managed", "true"], "start": "2024-09-20 21:28:04.118465", "end": "2024-09-20 21:28:04.194035", "delta": "0:00:00.075570", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli d set veth0 managed true", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 7557 1726882084.21404: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. <<< 7557 1726882084.21409: stdout chunk (state=3): >>><<< 7557 1726882084.21412: stderr chunk (state=3): >>><<< 7557 1726882084.21414: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["nmcli", "d", "set", "veth0", "managed", "true"], "start": "2024-09-20 21:28:04.118465", "end": "2024-09-20 21:28:04.194035", "delta": "0:00:00.075570", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli d set veth0 managed true", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status 
from master 0 Shared connection to 10.31.10.229 closed. 7557 1726882084.21418: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli d set veth0 managed true', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882083.8815267-8017-213529888624154/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7557 1726882084.21420: _low_level_execute_command(): starting 7557 1726882084.21422: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882083.8815267-8017-213529888624154/ > /dev/null 2>&1 && sleep 0' 7557 1726882084.22013: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7557 1726882084.22031: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7557 1726882084.22058: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7557 1726882084.22087: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address <<< 7557 1726882084.22171: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882084.22209: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882084.22297: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882084.24186: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882084.24211: stderr chunk (state=3): >>><<< 7557 1726882084.24227: stdout chunk (state=3): >>><<< 7557 1726882084.24403: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7557 1726882084.24407: handler run complete 7557 1726882084.24409: Evaluated conditional (False): False 7557 1726882084.24411: attempt loop complete, returning result 7557 1726882084.24413: _execute() done 7557 1726882084.24415: dumping result to json 7557 1726882084.24417: done dumping result, returning 7557 1726882084.24420: done running TaskExecutor() for managed_node3/TASK: Set up veth as managed by NetworkManager [12673a56-9f93-ed48-b3a5-0000000003aa] 7557 1726882084.24422: sending task result for task 12673a56-9f93-ed48-b3a5-0000000003aa ok: [managed_node3] => { "changed": false, "cmd": [ "nmcli", "d", "set", "veth0", "managed", "true" ], "delta": "0:00:00.075570", "end": "2024-09-20 21:28:04.194035", "rc": 0, "start": "2024-09-20 21:28:04.118465" } 7557 1726882084.24564: no more pending results, returning what we have 7557 1726882084.24567: results queue empty 7557 1726882084.24568: checking for any_errors_fatal 7557 1726882084.24584: done checking for any_errors_fatal 7557 1726882084.24584: checking for max_fail_percentage 7557 1726882084.24586: done checking for max_fail_percentage 7557 1726882084.24587: checking to see if all hosts have failed and the running result is not ok 7557 1726882084.24588: done checking to see if all hosts have failed 7557 1726882084.24588: getting the remaining hosts for this loop 7557 1726882084.24590: done getting the remaining hosts for this loop 7557 1726882084.24612: getting the next task for host managed_node3 7557 1726882084.24619: done getting next task for host managed_node3 7557 1726882084.24622: ^ task is: TASK: Delete veth interface {{ interface }} 7557 1726882084.24625: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7557 1726882084.24630: getting variables 7557 1726882084.24632: in VariableManager get_vars() 7557 1726882084.24684: Calling all_inventory to load vars for managed_node3 7557 1726882084.24687: Calling groups_inventory to load vars for managed_node3 7557 1726882084.24690: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882084.24875: Calling all_plugins_play to load vars for managed_node3 7557 1726882084.24879: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882084.24882: Calling groups_plugins_play to load vars for managed_node3 7557 1726882084.25075: done sending task result for task 12673a56-9f93-ed48-b3a5-0000000003aa 7557 1726882084.25078: WORKER PROCESS EXITING 7557 1726882084.25116: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882084.25239: done with get_vars() 7557 1726882084.25252: done getting variables 7557 1726882084.25299: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 7557 1726882084.25381: variable 'interface' from source: play vars TASK [Delete veth interface veth0] ********************************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:43 Friday 20 September 2024 21:28:04 -0400 (0:00:00.410) 0:00:10.107 ****** 7557 1726882084.25406: entering _queue_task() for managed_node3/command 7557 1726882084.25600: worker is 1 (out of 1 available) 7557 1726882084.25613: exiting _queue_task() for managed_node3/command 7557 1726882084.25626: done queuing things up, now waiting for results queue to drain 7557 1726882084.25627: waiting for pending results... 
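The "Set up veth as managed by NetworkManager" task that reported ok just above (manage_test_interface.yml:35) can be sketched from what the log records: the when condition that evaluated True was type == 'veth' and state == 'present', the command executed on the target was nmcli d set veth0 managed true with {{ interface }} resolving to veth0, and while the module JSON returned changed: true, the task result printed changed: false, which points to something like a changed_when: false override. A minimal, hedged reconstruction under those assumptions, not the actual contents of the task file:

- name: Set up veth as managed by NetworkManager
  command: nmcli d set {{ interface }} managed true
  when: type == 'veth' and state == 'present'
  changed_when: false   # inferred: module reported changed=true, task result shows changed=false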
7557 1726882084.25777: running TaskExecutor() for managed_node3/TASK: Delete veth interface veth0 7557 1726882084.25844: in run() - task 12673a56-9f93-ed48-b3a5-0000000003ab 7557 1726882084.25857: variable 'ansible_search_path' from source: unknown 7557 1726882084.25861: variable 'ansible_search_path' from source: unknown 7557 1726882084.25889: calling self._execute() 7557 1726882084.25957: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882084.25961: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882084.25971: variable 'omit' from source: magic vars 7557 1726882084.26219: variable 'ansible_distribution_major_version' from source: facts 7557 1726882084.26230: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882084.26359: variable 'type' from source: play vars 7557 1726882084.26362: variable 'state' from source: include params 7557 1726882084.26365: variable 'interface' from source: play vars 7557 1726882084.26370: variable 'current_interfaces' from source: set_fact 7557 1726882084.26378: Evaluated conditional (type == 'veth' and state == 'absent' and interface in current_interfaces): False 7557 1726882084.26381: when evaluation is False, skipping this task 7557 1726882084.26383: _execute() done 7557 1726882084.26386: dumping result to json 7557 1726882084.26388: done dumping result, returning 7557 1726882084.26398: done running TaskExecutor() for managed_node3/TASK: Delete veth interface veth0 [12673a56-9f93-ed48-b3a5-0000000003ab] 7557 1726882084.26400: sending task result for task 12673a56-9f93-ed48-b3a5-0000000003ab 7557 1726882084.26479: done sending task result for task 12673a56-9f93-ed48-b3a5-0000000003ab 7557 1726882084.26482: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "type == 'veth' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was False" } 7557 1726882084.26553: no more pending results, returning what we have 7557 1726882084.26556: results queue empty 7557 1726882084.26557: checking for any_errors_fatal 7557 1726882084.26564: done checking for any_errors_fatal 7557 1726882084.26565: checking for max_fail_percentage 7557 1726882084.26567: done checking for max_fail_percentage 7557 1726882084.26567: checking to see if all hosts have failed and the running result is not ok 7557 1726882084.26568: done checking to see if all hosts have failed 7557 1726882084.26569: getting the remaining hosts for this loop 7557 1726882084.26570: done getting the remaining hosts for this loop 7557 1726882084.26572: getting the next task for host managed_node3 7557 1726882084.26577: done getting next task for host managed_node3 7557 1726882084.26579: ^ task is: TASK: Create dummy interface {{ interface }} 7557 1726882084.26582: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7557 1726882084.26586: getting variables 7557 1726882084.26587: in VariableManager get_vars() 7557 1726882084.26629: Calling all_inventory to load vars for managed_node3 7557 1726882084.26631: Calling groups_inventory to load vars for managed_node3 7557 1726882084.26633: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882084.26639: Calling all_plugins_play to load vars for managed_node3 7557 1726882084.26641: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882084.26643: Calling groups_plugins_play to load vars for managed_node3 7557 1726882084.26767: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882084.26974: done with get_vars() 7557 1726882084.26983: done getting variables 7557 1726882084.27121: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 7557 1726882084.27228: variable 'interface' from source: play vars TASK [Create dummy interface veth0] ******************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:49 Friday 20 September 2024 21:28:04 -0400 (0:00:00.018) 0:00:10.125 ****** 7557 1726882084.27257: entering _queue_task() for managed_node3/command 7557 1726882084.27721: worker is 1 (out of 1 available) 7557 1726882084.27728: exiting _queue_task() for managed_node3/command 7557 1726882084.27739: done queuing things up, now waiting for results queue to drain 7557 1726882084.27740: waiting for pending results... 
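The "Delete veth interface veth0" task was skipped because its when clause, logged as false_condition, evaluated False: type == 'veth' and state == 'absent' and interface in current_interfaces. The dummy and tap create/delete tasks that follow are gated the same way. A generic sketch of that pattern, with the when clause taken verbatim from the log and the command a hypothetical placeholder (skipped tasks never reach their command, so it does not appear anywhere in this log):

- name: Delete veth interface {{ interface }}
  command: ip link del {{ interface }}   # hypothetical placeholder; the real command is not visible for skipped tasks
  when: type == 'veth' and state == 'absent' and interface in current_interfaces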
7557 1726882084.27867: running TaskExecutor() for managed_node3/TASK: Create dummy interface veth0 7557 1726882084.27881: in run() - task 12673a56-9f93-ed48-b3a5-0000000003ac 7557 1726882084.27906: variable 'ansible_search_path' from source: unknown 7557 1726882084.27914: variable 'ansible_search_path' from source: unknown 7557 1726882084.27955: calling self._execute() 7557 1726882084.28061: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882084.28079: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882084.28086: variable 'omit' from source: magic vars 7557 1726882084.28354: variable 'ansible_distribution_major_version' from source: facts 7557 1726882084.28364: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882084.28498: variable 'type' from source: play vars 7557 1726882084.28511: variable 'state' from source: include params 7557 1726882084.28514: variable 'interface' from source: play vars 7557 1726882084.28517: variable 'current_interfaces' from source: set_fact 7557 1726882084.28520: Evaluated conditional (type == 'dummy' and state == 'present' and interface not in current_interfaces): False 7557 1726882084.28523: when evaluation is False, skipping this task 7557 1726882084.28525: _execute() done 7557 1726882084.28527: dumping result to json 7557 1726882084.28530: done dumping result, returning 7557 1726882084.28537: done running TaskExecutor() for managed_node3/TASK: Create dummy interface veth0 [12673a56-9f93-ed48-b3a5-0000000003ac] 7557 1726882084.28542: sending task result for task 12673a56-9f93-ed48-b3a5-0000000003ac 7557 1726882084.28618: done sending task result for task 12673a56-9f93-ed48-b3a5-0000000003ac 7557 1726882084.28621: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "type == 'dummy' and state == 'present' and interface not in current_interfaces", "skip_reason": "Conditional result was False" } 7557 1726882084.28666: no more pending results, returning what we have 7557 1726882084.28670: results queue empty 7557 1726882084.28671: checking for any_errors_fatal 7557 1726882084.28678: done checking for any_errors_fatal 7557 1726882084.28679: checking for max_fail_percentage 7557 1726882084.28681: done checking for max_fail_percentage 7557 1726882084.28681: checking to see if all hosts have failed and the running result is not ok 7557 1726882084.28682: done checking to see if all hosts have failed 7557 1726882084.28683: getting the remaining hosts for this loop 7557 1726882084.28684: done getting the remaining hosts for this loop 7557 1726882084.28687: getting the next task for host managed_node3 7557 1726882084.28692: done getting next task for host managed_node3 7557 1726882084.28697: ^ task is: TASK: Delete dummy interface {{ interface }} 7557 1726882084.28700: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7557 1726882084.28703: getting variables 7557 1726882084.28704: in VariableManager get_vars() 7557 1726882084.28746: Calling all_inventory to load vars for managed_node3 7557 1726882084.28748: Calling groups_inventory to load vars for managed_node3 7557 1726882084.28750: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882084.28759: Calling all_plugins_play to load vars for managed_node3 7557 1726882084.28762: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882084.28764: Calling groups_plugins_play to load vars for managed_node3 7557 1726882084.28880: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882084.28997: done with get_vars() 7557 1726882084.29005: done getting variables 7557 1726882084.29044: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 7557 1726882084.29118: variable 'interface' from source: play vars TASK [Delete dummy interface veth0] ******************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:54 Friday 20 September 2024 21:28:04 -0400 (0:00:00.018) 0:00:10.144 ****** 7557 1726882084.29139: entering _queue_task() for managed_node3/command 7557 1726882084.29315: worker is 1 (out of 1 available) 7557 1726882084.29328: exiting _queue_task() for managed_node3/command 7557 1726882084.29342: done queuing things up, now waiting for results queue to drain 7557 1726882084.29344: waiting for pending results... 
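By contrast, the earlier "Create veth interface veth0" task did run, and its per-item result (item=ip link set veth0 up, ansible_loop_var: item) shows it loops over raw command strings. A sketch of that shape; only the 'ip link set veth0 up' item is actually present in this log, the other loop items and the when clause are illustrative assumptions mirroring the surrounding tasks:

- name: Create veth interface {{ interface }}
  command: "{{ item }}"
  loop:
    - ip link add {{ interface }} type veth peer name peer{{ interface }}   # illustrative only
    - ip link set peer{{ interface }} up                                    # illustrative only
    - ip link set {{ interface }} up                                        # this item appears in the log
  when: type == 'veth' and state == 'present' and interface not in current_interfaces   # assumed, not logged here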
7557 1726882084.29489: running TaskExecutor() for managed_node3/TASK: Delete dummy interface veth0 7557 1726882084.29542: in run() - task 12673a56-9f93-ed48-b3a5-0000000003ad 7557 1726882084.29553: variable 'ansible_search_path' from source: unknown 7557 1726882084.29557: variable 'ansible_search_path' from source: unknown 7557 1726882084.29586: calling self._execute() 7557 1726882084.29653: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882084.29657: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882084.29666: variable 'omit' from source: magic vars 7557 1726882084.29935: variable 'ansible_distribution_major_version' from source: facts 7557 1726882084.29942: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882084.30301: variable 'type' from source: play vars 7557 1726882084.30304: variable 'state' from source: include params 7557 1726882084.30307: variable 'interface' from source: play vars 7557 1726882084.30310: variable 'current_interfaces' from source: set_fact 7557 1726882084.30313: Evaluated conditional (type == 'dummy' and state == 'absent' and interface in current_interfaces): False 7557 1726882084.30316: when evaluation is False, skipping this task 7557 1726882084.30319: _execute() done 7557 1726882084.30321: dumping result to json 7557 1726882084.30323: done dumping result, returning 7557 1726882084.30325: done running TaskExecutor() for managed_node3/TASK: Delete dummy interface veth0 [12673a56-9f93-ed48-b3a5-0000000003ad] 7557 1726882084.30327: sending task result for task 12673a56-9f93-ed48-b3a5-0000000003ad 7557 1726882084.30386: done sending task result for task 12673a56-9f93-ed48-b3a5-0000000003ad 7557 1726882084.30389: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "type == 'dummy' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was False" } 7557 1726882084.30442: no more pending results, returning what we have 7557 1726882084.30445: results queue empty 7557 1726882084.30445: checking for any_errors_fatal 7557 1726882084.30450: done checking for any_errors_fatal 7557 1726882084.30451: checking for max_fail_percentage 7557 1726882084.30452: done checking for max_fail_percentage 7557 1726882084.30453: checking to see if all hosts have failed and the running result is not ok 7557 1726882084.30454: done checking to see if all hosts have failed 7557 1726882084.30454: getting the remaining hosts for this loop 7557 1726882084.30455: done getting the remaining hosts for this loop 7557 1726882084.30458: getting the next task for host managed_node3 7557 1726882084.30463: done getting next task for host managed_node3 7557 1726882084.30465: ^ task is: TASK: Create tap interface {{ interface }} 7557 1726882084.30469: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7557 1726882084.30472: getting variables 7557 1726882084.30473: in VariableManager get_vars() 7557 1726882084.30522: Calling all_inventory to load vars for managed_node3 7557 1726882084.30525: Calling groups_inventory to load vars for managed_node3 7557 1726882084.30527: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882084.30536: Calling all_plugins_play to load vars for managed_node3 7557 1726882084.30539: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882084.30542: Calling groups_plugins_play to load vars for managed_node3 7557 1726882084.30770: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882084.30981: done with get_vars() 7557 1726882084.30991: done getting variables 7557 1726882084.31047: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 7557 1726882084.31153: variable 'interface' from source: play vars TASK [Create tap interface veth0] ********************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:60 Friday 20 September 2024 21:28:04 -0400 (0:00:00.020) 0:00:10.164 ****** 7557 1726882084.31190: entering _queue_task() for managed_node3/command 7557 1726882084.31452: worker is 1 (out of 1 available) 7557 1726882084.31465: exiting _queue_task() for managed_node3/command 7557 1726882084.31477: done queuing things up, now waiting for results queue to drain 7557 1726882084.31479: waiting for pending results... 
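Every _low_level_execute_command() in this run reuses an existing SSH ControlMaster socket ("auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41'"), so only the first connection to 10.31.10.229 pays the full handshake cost; subsequent commands go through the multiplexed channel. That behaviour comes from ansible-core's default ssh_args. Expressed explicitly as inventory variables it would look roughly like the following; this is an illustrative equivalent, not taken from this run's inventory:

ansible_connection: ssh
ansible_ssh_common_args: "-o ControlMaster=auto -o ControlPersist=60s -o ControlPath=~/.ansible/cp/%C"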
7557 1726882084.31754: running TaskExecutor() for managed_node3/TASK: Create tap interface veth0 7557 1726882084.31867: in run() - task 12673a56-9f93-ed48-b3a5-0000000003ae 7557 1726882084.31888: variable 'ansible_search_path' from source: unknown 7557 1726882084.31899: variable 'ansible_search_path' from source: unknown 7557 1726882084.31949: calling self._execute() 7557 1726882084.32051: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882084.32063: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882084.32081: variable 'omit' from source: magic vars 7557 1726882084.32472: variable 'ansible_distribution_major_version' from source: facts 7557 1726882084.32488: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882084.32698: variable 'type' from source: play vars 7557 1726882084.32709: variable 'state' from source: include params 7557 1726882084.32718: variable 'interface' from source: play vars 7557 1726882084.32796: variable 'current_interfaces' from source: set_fact 7557 1726882084.32800: Evaluated conditional (type == 'tap' and state == 'present' and interface not in current_interfaces): False 7557 1726882084.32803: when evaluation is False, skipping this task 7557 1726882084.32805: _execute() done 7557 1726882084.32807: dumping result to json 7557 1726882084.32809: done dumping result, returning 7557 1726882084.32811: done running TaskExecutor() for managed_node3/TASK: Create tap interface veth0 [12673a56-9f93-ed48-b3a5-0000000003ae] 7557 1726882084.32813: sending task result for task 12673a56-9f93-ed48-b3a5-0000000003ae 7557 1726882084.32877: done sending task result for task 12673a56-9f93-ed48-b3a5-0000000003ae 7557 1726882084.32881: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "type == 'tap' and state == 'present' and interface not in current_interfaces", "skip_reason": "Conditional result was False" } 7557 1726882084.32947: no more pending results, returning what we have 7557 1726882084.32951: results queue empty 7557 1726882084.32952: checking for any_errors_fatal 7557 1726882084.32958: done checking for any_errors_fatal 7557 1726882084.32959: checking for max_fail_percentage 7557 1726882084.32961: done checking for max_fail_percentage 7557 1726882084.32962: checking to see if all hosts have failed and the running result is not ok 7557 1726882084.32963: done checking to see if all hosts have failed 7557 1726882084.32964: getting the remaining hosts for this loop 7557 1726882084.32965: done getting the remaining hosts for this loop 7557 1726882084.32969: getting the next task for host managed_node3 7557 1726882084.32975: done getting next task for host managed_node3 7557 1726882084.32978: ^ task is: TASK: Delete tap interface {{ interface }} 7557 1726882084.32982: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7557 1726882084.32986: getting variables 7557 1726882084.32988: in VariableManager get_vars() 7557 1726882084.33233: Calling all_inventory to load vars for managed_node3 7557 1726882084.33236: Calling groups_inventory to load vars for managed_node3 7557 1726882084.33239: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882084.33249: Calling all_plugins_play to load vars for managed_node3 7557 1726882084.33251: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882084.33254: Calling groups_plugins_play to load vars for managed_node3 7557 1726882084.33487: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882084.33702: done with get_vars() 7557 1726882084.33713: done getting variables 7557 1726882084.33777: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 7557 1726882084.33892: variable 'interface' from source: play vars TASK [Delete tap interface veth0] ********************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:65 Friday 20 September 2024 21:28:04 -0400 (0:00:00.027) 0:00:10.192 ****** 7557 1726882084.33924: entering _queue_task() for managed_node3/command 7557 1726882084.34298: worker is 1 (out of 1 available) 7557 1726882084.34310: exiting _queue_task() for managed_node3/command 7557 1726882084.34321: done queuing things up, now waiting for results queue to drain 7557 1726882084.34323: waiting for pending results... 
7557 1726882084.34561: running TaskExecutor() for managed_node3/TASK: Delete tap interface veth0 7557 1726882084.34589: in run() - task 12673a56-9f93-ed48-b3a5-0000000003af 7557 1726882084.34616: variable 'ansible_search_path' from source: unknown 7557 1726882084.34625: variable 'ansible_search_path' from source: unknown 7557 1726882084.34718: calling self._execute() 7557 1726882084.34776: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882084.34790: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882084.34807: variable 'omit' from source: magic vars 7557 1726882084.35182: variable 'ansible_distribution_major_version' from source: facts 7557 1726882084.35208: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882084.35427: variable 'type' from source: play vars 7557 1726882084.35480: variable 'state' from source: include params 7557 1726882084.35485: variable 'interface' from source: play vars 7557 1726882084.35487: variable 'current_interfaces' from source: set_fact 7557 1726882084.35490: Evaluated conditional (type == 'tap' and state == 'absent' and interface in current_interfaces): False 7557 1726882084.35492: when evaluation is False, skipping this task 7557 1726882084.35496: _execute() done 7557 1726882084.35498: dumping result to json 7557 1726882084.35500: done dumping result, returning 7557 1726882084.35502: done running TaskExecutor() for managed_node3/TASK: Delete tap interface veth0 [12673a56-9f93-ed48-b3a5-0000000003af] 7557 1726882084.35505: sending task result for task 12673a56-9f93-ed48-b3a5-0000000003af skipping: [managed_node3] => { "changed": false, "false_condition": "type == 'tap' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was False" } 7557 1726882084.35744: no more pending results, returning what we have 7557 1726882084.35748: results queue empty 7557 1726882084.35749: checking for any_errors_fatal 7557 1726882084.35757: done checking for any_errors_fatal 7557 1726882084.35757: checking for max_fail_percentage 7557 1726882084.35759: done checking for max_fail_percentage 7557 1726882084.35759: checking to see if all hosts have failed and the running result is not ok 7557 1726882084.35760: done checking to see if all hosts have failed 7557 1726882084.35761: getting the remaining hosts for this loop 7557 1726882084.35763: done getting the remaining hosts for this loop 7557 1726882084.35766: getting the next task for host managed_node3 7557 1726882084.35774: done getting next task for host managed_node3 7557 1726882084.35776: ^ task is: TASK: Include the task 'assert_device_present.yml' 7557 1726882084.35779: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7557 1726882084.35784: getting variables 7557 1726882084.35785: in VariableManager get_vars() 7557 1726882084.35945: Calling all_inventory to load vars for managed_node3 7557 1726882084.35948: Calling groups_inventory to load vars for managed_node3 7557 1726882084.35950: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882084.35960: Calling all_plugins_play to load vars for managed_node3 7557 1726882084.35962: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882084.35965: Calling groups_plugins_play to load vars for managed_node3 7557 1726882084.36246: done sending task result for task 12673a56-9f93-ed48-b3a5-0000000003af 7557 1726882084.36249: WORKER PROCESS EXITING 7557 1726882084.36271: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882084.36469: done with get_vars() 7557 1726882084.36479: done getting variables TASK [Include the task 'assert_device_present.yml'] **************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_auto_gateway.yml:15 Friday 20 September 2024 21:28:04 -0400 (0:00:00.026) 0:00:10.218 ****** 7557 1726882084.36577: entering _queue_task() for managed_node3/include_tasks 7557 1726882084.36837: worker is 1 (out of 1 available) 7557 1726882084.36849: exiting _queue_task() for managed_node3/include_tasks 7557 1726882084.36977: done queuing things up, now waiting for results queue to drain 7557 1726882084.36979: waiting for pending results... 7557 1726882084.37132: running TaskExecutor() for managed_node3/TASK: Include the task 'assert_device_present.yml' 7557 1726882084.37236: in run() - task 12673a56-9f93-ed48-b3a5-00000000000d 7557 1726882084.37257: variable 'ansible_search_path' from source: unknown 7557 1726882084.37300: calling self._execute() 7557 1726882084.37402: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882084.37421: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882084.37440: variable 'omit' from source: magic vars 7557 1726882084.37819: variable 'ansible_distribution_major_version' from source: facts 7557 1726882084.37837: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882084.37856: _execute() done 7557 1726882084.37870: dumping result to json 7557 1726882084.37898: done dumping result, returning 7557 1726882084.37902: done running TaskExecutor() for managed_node3/TASK: Include the task 'assert_device_present.yml' [12673a56-9f93-ed48-b3a5-00000000000d] 7557 1726882084.37904: sending task result for task 12673a56-9f93-ed48-b3a5-00000000000d 7557 1726882084.38141: no more pending results, returning what we have 7557 1726882084.38146: in VariableManager get_vars() 7557 1726882084.38206: Calling all_inventory to load vars for managed_node3 7557 1726882084.38209: Calling groups_inventory to load vars for managed_node3 7557 1726882084.38212: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882084.38228: Calling all_plugins_play to load vars for managed_node3 7557 1726882084.38231: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882084.38235: Calling groups_plugins_play to load vars for managed_node3 7557 1726882084.38534: done sending task result for task 12673a56-9f93-ed48-b3a5-00000000000d 7557 1726882084.38538: WORKER PROCESS EXITING 7557 1726882084.38560: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882084.38769: done with get_vars() 7557 1726882084.38777: variable 'ansible_search_path' from source: unknown 7557 1726882084.38789: we have included files to process 7557 1726882084.38790: generating all_blocks data 7557 1726882084.38792: done generating all_blocks data 7557 1726882084.38796: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 7557 1726882084.38797: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 7557 1726882084.38800: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 7557 1726882084.38968: in VariableManager get_vars() 7557 1726882084.38995: done with get_vars() 7557 1726882084.39109: done processing included file 7557 1726882084.39112: iterating over new_blocks loaded from include file 7557 1726882084.39113: in VariableManager get_vars() 7557 1726882084.39134: done with get_vars() 7557 1726882084.39135: filtering new block on tags 7557 1726882084.39151: done filtering new block on tags 7557 1726882084.39158: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml for managed_node3 7557 1726882084.39167: extending task lists for all hosts with included blocks 7557 1726882084.44727: done extending task lists 7557 1726882084.44729: done processing included files 7557 1726882084.44730: results queue empty 7557 1726882084.44731: checking for any_errors_fatal 7557 1726882084.44734: done checking for any_errors_fatal 7557 1726882084.44735: checking for max_fail_percentage 7557 1726882084.44740: done checking for max_fail_percentage 7557 1726882084.44741: checking to see if all hosts have failed and the running result is not ok 7557 1726882084.44742: done checking to see if all hosts have failed 7557 1726882084.44743: getting the remaining hosts for this loop 7557 1726882084.44744: done getting the remaining hosts for this loop 7557 1726882084.44747: getting the next task for host managed_node3 7557 1726882084.44751: done getting next task for host managed_node3 7557 1726882084.44753: ^ task is: TASK: Include the task 'get_interface_stat.yml' 7557 1726882084.44756: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7557 1726882084.44758: getting variables 7557 1726882084.44760: in VariableManager get_vars() 7557 1726882084.44783: Calling all_inventory to load vars for managed_node3 7557 1726882084.44790: Calling groups_inventory to load vars for managed_node3 7557 1726882084.44794: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882084.44801: Calling all_plugins_play to load vars for managed_node3 7557 1726882084.44804: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882084.44807: Calling groups_plugins_play to load vars for managed_node3 7557 1726882084.44965: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882084.45159: done with get_vars() 7557 1726882084.45175: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3 Friday 20 September 2024 21:28:04 -0400 (0:00:00.086) 0:00:10.305 ****** 7557 1726882084.45258: entering _queue_task() for managed_node3/include_tasks 7557 1726882084.45805: worker is 1 (out of 1 available) 7557 1726882084.45812: exiting _queue_task() for managed_node3/include_tasks 7557 1726882084.45823: done queuing things up, now waiting for results queue to drain 7557 1726882084.45824: waiting for pending results... 7557 1726882084.45883: running TaskExecutor() for managed_node3/TASK: Include the task 'get_interface_stat.yml' 7557 1726882084.45996: in run() - task 12673a56-9f93-ed48-b3a5-0000000005f5 7557 1726882084.46054: variable 'ansible_search_path' from source: unknown 7557 1726882084.46057: variable 'ansible_search_path' from source: unknown 7557 1726882084.46072: calling self._execute() 7557 1726882084.46177: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882084.46183: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882084.46191: variable 'omit' from source: magic vars 7557 1726882084.46501: variable 'ansible_distribution_major_version' from source: facts 7557 1726882084.46512: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882084.46517: _execute() done 7557 1726882084.46520: dumping result to json 7557 1726882084.46523: done dumping result, returning 7557 1726882084.46529: done running TaskExecutor() for managed_node3/TASK: Include the task 'get_interface_stat.yml' [12673a56-9f93-ed48-b3a5-0000000005f5] 7557 1726882084.46534: sending task result for task 12673a56-9f93-ed48-b3a5-0000000005f5 7557 1726882084.46612: done sending task result for task 12673a56-9f93-ed48-b3a5-0000000005f5 7557 1726882084.46615: WORKER PROCESS EXITING 7557 1726882084.46643: no more pending results, returning what we have 7557 1726882084.46648: in VariableManager get_vars() 7557 1726882084.46702: Calling all_inventory to load vars for managed_node3 7557 1726882084.46705: Calling groups_inventory to load vars for managed_node3 7557 1726882084.46708: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882084.46719: Calling all_plugins_play to load vars for managed_node3 7557 1726882084.46722: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882084.46726: Calling groups_plugins_play to load vars for managed_node3 7557 1726882084.46877: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 
1726882084.46991: done with get_vars() 7557 1726882084.46999: variable 'ansible_search_path' from source: unknown 7557 1726882084.47000: variable 'ansible_search_path' from source: unknown 7557 1726882084.47027: we have included files to process 7557 1726882084.47028: generating all_blocks data 7557 1726882084.47028: done generating all_blocks data 7557 1726882084.47029: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 7557 1726882084.47030: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 7557 1726882084.47031: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 7557 1726882084.47178: done processing included file 7557 1726882084.47179: iterating over new_blocks loaded from include file 7557 1726882084.47180: in VariableManager get_vars() 7557 1726882084.47198: done with get_vars() 7557 1726882084.47199: filtering new block on tags 7557 1726882084.47208: done filtering new block on tags 7557 1726882084.47210: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed_node3 7557 1726882084.47213: extending task lists for all hosts with included blocks 7557 1726882084.47271: done extending task lists 7557 1726882084.47272: done processing included files 7557 1726882084.47273: results queue empty 7557 1726882084.47273: checking for any_errors_fatal 7557 1726882084.47276: done checking for any_errors_fatal 7557 1726882084.47276: checking for max_fail_percentage 7557 1726882084.47277: done checking for max_fail_percentage 7557 1726882084.47277: checking to see if all hosts have failed and the running result is not ok 7557 1726882084.47278: done checking to see if all hosts have failed 7557 1726882084.47278: getting the remaining hosts for this loop 7557 1726882084.47279: done getting the remaining hosts for this loop 7557 1726882084.47281: getting the next task for host managed_node3 7557 1726882084.47283: done getting next task for host managed_node3 7557 1726882084.47286: ^ task is: TASK: Get stat for interface {{ interface }} 7557 1726882084.47288: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7557 1726882084.47289: getting variables 7557 1726882084.47290: in VariableManager get_vars() 7557 1726882084.47303: Calling all_inventory to load vars for managed_node3 7557 1726882084.47304: Calling groups_inventory to load vars for managed_node3 7557 1726882084.47306: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882084.47309: Calling all_plugins_play to load vars for managed_node3 7557 1726882084.47310: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882084.47312: Calling groups_plugins_play to load vars for managed_node3 7557 1726882084.47392: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882084.47518: done with get_vars() 7557 1726882084.47524: done getting variables 7557 1726882084.47628: variable 'interface' from source: play vars TASK [Get stat for interface veth0] ******************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Friday 20 September 2024 21:28:04 -0400 (0:00:00.023) 0:00:10.329 ****** 7557 1726882084.47657: entering _queue_task() for managed_node3/stat 7557 1726882084.47904: worker is 1 (out of 1 available) 7557 1726882084.47918: exiting _queue_task() for managed_node3/stat 7557 1726882084.47930: done queuing things up, now waiting for results queue to drain 7557 1726882084.47931: waiting for pending results... 7557 1726882084.48213: running TaskExecutor() for managed_node3/TASK: Get stat for interface veth0 7557 1726882084.48218: in run() - task 12673a56-9f93-ed48-b3a5-0000000007ee 7557 1726882084.48226: variable 'ansible_search_path' from source: unknown 7557 1726882084.48230: variable 'ansible_search_path' from source: unknown 7557 1726882084.48262: calling self._execute() 7557 1726882084.48357: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882084.48363: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882084.48373: variable 'omit' from source: magic vars 7557 1726882084.48726: variable 'ansible_distribution_major_version' from source: facts 7557 1726882084.48737: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882084.48747: variable 'omit' from source: magic vars 7557 1726882084.48784: variable 'omit' from source: magic vars 7557 1726882084.48973: variable 'interface' from source: play vars 7557 1726882084.48977: variable 'omit' from source: magic vars 7557 1726882084.48979: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7557 1726882084.48982: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7557 1726882084.48984: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7557 1726882084.48992: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7557 1726882084.49014: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7557 1726882084.49198: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7557 1726882084.49202: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882084.49205: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882084.49207: Set 
connection var ansible_module_compression to ZIP_DEFLATED 7557 1726882084.49209: Set connection var ansible_shell_executable to /bin/sh 7557 1726882084.49212: Set connection var ansible_shell_type to sh 7557 1726882084.49214: Set connection var ansible_pipelining to False 7557 1726882084.49216: Set connection var ansible_connection to ssh 7557 1726882084.49220: Set connection var ansible_timeout to 10 7557 1726882084.49222: variable 'ansible_shell_executable' from source: unknown 7557 1726882084.49224: variable 'ansible_connection' from source: unknown 7557 1726882084.49226: variable 'ansible_module_compression' from source: unknown 7557 1726882084.49228: variable 'ansible_shell_type' from source: unknown 7557 1726882084.49230: variable 'ansible_shell_executable' from source: unknown 7557 1726882084.49232: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882084.49234: variable 'ansible_pipelining' from source: unknown 7557 1726882084.49236: variable 'ansible_timeout' from source: unknown 7557 1726882084.49238: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882084.49468: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 7557 1726882084.49473: variable 'omit' from source: magic vars 7557 1726882084.49476: starting attempt loop 7557 1726882084.49478: running the handler 7557 1726882084.49480: _low_level_execute_command(): starting 7557 1726882084.49482: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7557 1726882084.50029: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7557 1726882084.50038: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882084.50045: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7557 1726882084.50073: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found <<< 7557 1726882084.50076: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address <<< 7557 1726882084.50078: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882084.50081: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882084.50143: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882084.50146: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882084.50197: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882084.51916: stdout chunk (state=3): >>>/root <<< 7557 1726882084.52192: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 
1726882084.52200: stdout chunk (state=3): >>><<< 7557 1726882084.52203: stderr chunk (state=3): >>><<< 7557 1726882084.52206: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7557 1726882084.52209: _low_level_execute_command(): starting 7557 1726882084.52212: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882084.5203893-8043-136517422221747 `" && echo ansible-tmp-1726882084.5203893-8043-136517422221747="` echo /root/.ansible/tmp/ansible-tmp-1726882084.5203893-8043-136517422221747 `" ) && sleep 0' 7557 1726882084.52635: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7557 1726882084.52639: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7557 1726882084.52669: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found <<< 7557 1726882084.52679: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882084.52725: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882084.52729: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882084.52784: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882084.54680: stdout chunk (state=3): >>>ansible-tmp-1726882084.5203893-8043-136517422221747=/root/.ansible/tmp/ansible-tmp-1726882084.5203893-8043-136517422221747 <<< 7557 1726882084.54797: stderr chunk (state=3): >>>debug2: Received exit 
status from master 0 <<< 7557 1726882084.54826: stderr chunk (state=3): >>><<< 7557 1726882084.54829: stdout chunk (state=3): >>><<< 7557 1726882084.54846: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882084.5203893-8043-136517422221747=/root/.ansible/tmp/ansible-tmp-1726882084.5203893-8043-136517422221747 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7557 1726882084.54883: variable 'ansible_module_compression' from source: unknown 7557 1726882084.54933: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-7557ap94rh2e/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 7557 1726882084.54961: variable 'ansible_facts' from source: unknown 7557 1726882084.55028: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882084.5203893-8043-136517422221747/AnsiballZ_stat.py 7557 1726882084.55127: Sending initial data 7557 1726882084.55130: Sent initial data (151 bytes) 7557 1726882084.55572: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882084.55575: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found <<< 7557 1726882084.55578: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882084.55580: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882084.55582: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882084.55637: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882084.55641: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882084.55692: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 7557 1726882084.57212: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 7557 1726882084.57255: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 7557 1726882084.57303: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-7557ap94rh2e/tmpsxciuf27 /root/.ansible/tmp/ansible-tmp-1726882084.5203893-8043-136517422221747/AnsiballZ_stat.py <<< 7557 1726882084.57309: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882084.5203893-8043-136517422221747/AnsiballZ_stat.py" <<< 7557 1726882084.57348: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-7557ap94rh2e/tmpsxciuf27" to remote "/root/.ansible/tmp/ansible-tmp-1726882084.5203893-8043-136517422221747/AnsiballZ_stat.py" <<< 7557 1726882084.57351: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882084.5203893-8043-136517422221747/AnsiballZ_stat.py" <<< 7557 1726882084.57887: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882084.57927: stderr chunk (state=3): >>><<< 7557 1726882084.57930: stdout chunk (state=3): >>><<< 7557 1726882084.57953: done transferring module to remote 7557 1726882084.57961: _low_level_execute_command(): starting 7557 1726882084.57970: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882084.5203893-8043-136517422221747/ /root/.ansible/tmp/ansible-tmp-1726882084.5203893-8043-136517422221747/AnsiballZ_stat.py && sleep 0' 7557 1726882084.58375: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882084.58403: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7557 1726882084.58407: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found <<< 7557 1726882084.58412: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882084.58415: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 7557 1726882084.58422: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882084.58437: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882084.58479: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882084.58483: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882084.58536: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882084.60238: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882084.60263: stderr chunk (state=3): >>><<< 7557 1726882084.60267: stdout chunk (state=3): >>><<< 7557 1726882084.60280: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7557 1726882084.60283: _low_level_execute_command(): starting 7557 1726882084.60288: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882084.5203893-8043-136517422221747/AnsiballZ_stat.py && sleep 0' 7557 1726882084.60732: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882084.60735: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found <<< 7557 1726882084.60737: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 7557 1726882084.60739: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882084.60741: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882084.60789: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 
1726882084.60805: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882084.60849: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882084.75875: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/veth0", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 25123, "dev": 23, "nlink": 1, "atime": 1726882083.1308763, "mtime": 1726882083.1308763, "ctime": 1726882083.1308763, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/veth0", "lnk_target": "../../devices/virtual/net/veth0", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/veth0", "follow": false, "checksum_algorithm": "sha1"}}} <<< 7557 1726882084.77164: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. <<< 7557 1726882084.77189: stderr chunk (state=3): >>><<< 7557 1726882084.77197: stdout chunk (state=3): >>><<< 7557 1726882084.77211: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/veth0", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 25123, "dev": 23, "nlink": 1, "atime": 1726882083.1308763, "mtime": 1726882083.1308763, "ctime": 1726882083.1308763, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/veth0", "lnk_target": "../../devices/virtual/net/veth0", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/veth0", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. 7557 1726882084.77249: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/veth0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882084.5203893-8043-136517422221747/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7557 1726882084.77262: _low_level_execute_command(): starting 7557 1726882084.77265: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882084.5203893-8043-136517422221747/ > /dev/null 2>&1 && sleep 0' 7557 1726882084.77740: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882084.77743: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882084.77746: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 7557 1726882084.77748: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found <<< 7557 1726882084.77750: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882084.77799: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882084.77804: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882084.77810: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882084.77857: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882084.79635: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882084.79658: stderr chunk (state=3): >>><<< 7557 1726882084.79661: stdout chunk (state=3): >>><<< 7557 1726882084.79677: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7557 1726882084.79682: handler run complete 7557 1726882084.79717: attempt loop complete, returning result 7557 1726882084.79720: _execute() done 7557 1726882084.79723: dumping result to json 7557 1726882084.79727: done dumping result, returning 7557 1726882084.79735: done running TaskExecutor() for managed_node3/TASK: Get stat for interface veth0 [12673a56-9f93-ed48-b3a5-0000000007ee] 7557 1726882084.79743: sending task result for task 12673a56-9f93-ed48-b3a5-0000000007ee 7557 1726882084.79842: done sending task result for task 12673a56-9f93-ed48-b3a5-0000000007ee 7557 1726882084.79844: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "stat": { "atime": 1726882083.1308763, "block_size": 4096, "blocks": 0, "ctime": 1726882083.1308763, "dev": 23, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 25123, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": true, "isreg": false, "issock": false, "isuid": false, "lnk_source": "/sys/devices/virtual/net/veth0", "lnk_target": "../../devices/virtual/net/veth0", "mode": "0777", "mtime": 1726882083.1308763, "nlink": 1, "path": "/sys/class/net/veth0", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "wgrp": true, "woth": true, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } 7557 1726882084.79930: no more pending results, returning what we have 7557 1726882084.79934: results queue empty 7557 1726882084.79935: checking for any_errors_fatal 7557 1726882084.79936: done checking for any_errors_fatal 7557 1726882084.79937: checking for max_fail_percentage 7557 1726882084.79939: done checking for max_fail_percentage 7557 1726882084.79939: checking to see if all hosts have failed and the running result is not ok 7557 1726882084.79940: done checking to see if all hosts have failed 7557 1726882084.79941: getting the remaining hosts for this loop 7557 1726882084.79943: done getting the remaining hosts for this loop 7557 1726882084.79946: getting the next task for host managed_node3 7557 1726882084.79954: done getting next task for host managed_node3 7557 1726882084.79957: ^ task is: TASK: Assert that the interface is present - '{{ interface }}' 7557 1726882084.79959: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7557 1726882084.79963: getting variables 7557 1726882084.79964: in VariableManager get_vars() 7557 1726882084.80016: Calling all_inventory to load vars for managed_node3 7557 1726882084.80019: Calling groups_inventory to load vars for managed_node3 7557 1726882084.80022: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882084.80031: Calling all_plugins_play to load vars for managed_node3 7557 1726882084.80033: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882084.80036: Calling groups_plugins_play to load vars for managed_node3 7557 1726882084.80161: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882084.80279: done with get_vars() 7557 1726882084.80288: done getting variables 7557 1726882084.80361: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) 7557 1726882084.80449: variable 'interface' from source: play vars TASK [Assert that the interface is present - 'veth0'] ************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5 Friday 20 September 2024 21:28:04 -0400 (0:00:00.328) 0:00:10.657 ****** 7557 1726882084.80471: entering _queue_task() for managed_node3/assert 7557 1726882084.80476: Creating lock for assert 7557 1726882084.80667: worker is 1 (out of 1 available) 7557 1726882084.80679: exiting _queue_task() for managed_node3/assert 7557 1726882084.80692: done queuing things up, now waiting for results queue to drain 7557 1726882084.80695: waiting for pending results... 
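The assert queued above consumes the stat result gathered immediately before it. Reconstructed from the trace, the two included task files plausibly contain something like the sketch below: the stat arguments are copied from the module_args echoed with the result, the register name is inferred from the interface_stat reference evaluated in the assert below, and the surrounding layout is assumed:

# tasks/get_interface_stat.yml (sketch)
- name: Get stat for interface {{ interface }}
  stat:
    path: "/sys/class/net/{{ interface }}"
    get_attributes: false
    get_checksum: false
    get_mime: false
  register: interface_stat   # inferred from the 'interface_stat' variable used by the assert

# tasks/assert_device_present.yml:5 (sketch; the asserted expression is the conditional reported as True below)
- name: Assert that the interface is present - '{{ interface }}'
  assert:
    that:
      - interface_stat.stat.exists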
7557 1726882084.80862: running TaskExecutor() for managed_node3/TASK: Assert that the interface is present - 'veth0' 7557 1726882084.80927: in run() - task 12673a56-9f93-ed48-b3a5-0000000005f6 7557 1726882084.80939: variable 'ansible_search_path' from source: unknown 7557 1726882084.80943: variable 'ansible_search_path' from source: unknown 7557 1726882084.80969: calling self._execute() 7557 1726882084.81039: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882084.81043: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882084.81052: variable 'omit' from source: magic vars 7557 1726882084.81579: variable 'ansible_distribution_major_version' from source: facts 7557 1726882084.81589: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882084.81592: variable 'omit' from source: magic vars 7557 1726882084.81621: variable 'omit' from source: magic vars 7557 1726882084.81685: variable 'interface' from source: play vars 7557 1726882084.81704: variable 'omit' from source: magic vars 7557 1726882084.81733: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7557 1726882084.81758: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7557 1726882084.81773: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7557 1726882084.81786: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7557 1726882084.81801: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7557 1726882084.81825: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7557 1726882084.81829: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882084.81831: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882084.81901: Set connection var ansible_module_compression to ZIP_DEFLATED 7557 1726882084.81905: Set connection var ansible_shell_executable to /bin/sh 7557 1726882084.81911: Set connection var ansible_shell_type to sh 7557 1726882084.81915: Set connection var ansible_pipelining to False 7557 1726882084.81918: Set connection var ansible_connection to ssh 7557 1726882084.81924: Set connection var ansible_timeout to 10 7557 1726882084.81940: variable 'ansible_shell_executable' from source: unknown 7557 1726882084.81944: variable 'ansible_connection' from source: unknown 7557 1726882084.81946: variable 'ansible_module_compression' from source: unknown 7557 1726882084.81948: variable 'ansible_shell_type' from source: unknown 7557 1726882084.81950: variable 'ansible_shell_executable' from source: unknown 7557 1726882084.81953: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882084.81955: variable 'ansible_pipelining' from source: unknown 7557 1726882084.81959: variable 'ansible_timeout' from source: unknown 7557 1726882084.81962: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882084.82063: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 7557 1726882084.82071: 
variable 'omit' from source: magic vars 7557 1726882084.82074: starting attempt loop 7557 1726882084.82079: running the handler 7557 1726882084.82169: variable 'interface_stat' from source: set_fact 7557 1726882084.82183: Evaluated conditional (interface_stat.stat.exists): True 7557 1726882084.82188: handler run complete 7557 1726882084.82202: attempt loop complete, returning result 7557 1726882084.82205: _execute() done 7557 1726882084.82208: dumping result to json 7557 1726882084.82210: done dumping result, returning 7557 1726882084.82215: done running TaskExecutor() for managed_node3/TASK: Assert that the interface is present - 'veth0' [12673a56-9f93-ed48-b3a5-0000000005f6] 7557 1726882084.82222: sending task result for task 12673a56-9f93-ed48-b3a5-0000000005f6 7557 1726882084.82300: done sending task result for task 12673a56-9f93-ed48-b3a5-0000000005f6 7557 1726882084.82302: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false } MSG: All assertions passed 7557 1726882084.82372: no more pending results, returning what we have 7557 1726882084.82375: results queue empty 7557 1726882084.82376: checking for any_errors_fatal 7557 1726882084.82382: done checking for any_errors_fatal 7557 1726882084.82382: checking for max_fail_percentage 7557 1726882084.82384: done checking for max_fail_percentage 7557 1726882084.82384: checking to see if all hosts have failed and the running result is not ok 7557 1726882084.82385: done checking to see if all hosts have failed 7557 1726882084.82386: getting the remaining hosts for this loop 7557 1726882084.82387: done getting the remaining hosts for this loop 7557 1726882084.82390: getting the next task for host managed_node3 7557 1726882084.82398: done getting next task for host managed_node3 7557 1726882084.82400: ^ task is: TASK: TEST: I can configure an interface with auto_gateway enabled 7557 1726882084.82403: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7557 1726882084.82407: getting variables 7557 1726882084.82408: in VariableManager get_vars() 7557 1726882084.82447: Calling all_inventory to load vars for managed_node3 7557 1726882084.82450: Calling groups_inventory to load vars for managed_node3 7557 1726882084.82452: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882084.82460: Calling all_plugins_play to load vars for managed_node3 7557 1726882084.82463: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882084.82465: Calling groups_plugins_play to load vars for managed_node3 7557 1726882084.82881: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882084.83062: done with get_vars() 7557 1726882084.83072: done getting variables 7557 1726882084.83129: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [TEST: I can configure an interface with auto_gateway enabled] ************ task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_auto_gateway.yml:17 Friday 20 September 2024 21:28:04 -0400 (0:00:00.026) 0:00:10.684 ****** 7557 1726882084.83154: entering _queue_task() for managed_node3/debug 7557 1726882084.83386: worker is 1 (out of 1 available) 7557 1726882084.83399: exiting _queue_task() for managed_node3/debug 7557 1726882084.83414: done queuing things up, now waiting for results queue to drain 7557 1726882084.83415: waiting for pending results... 7557 1726882084.83823: running TaskExecutor() for managed_node3/TASK: TEST: I can configure an interface with auto_gateway enabled 7557 1726882084.83828: in run() - task 12673a56-9f93-ed48-b3a5-00000000000e 7557 1726882084.83831: variable 'ansible_search_path' from source: unknown 7557 1726882084.83898: calling self._execute() 7557 1726882084.84004: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882084.84022: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882084.84042: variable 'omit' from source: magic vars 7557 1726882084.84424: variable 'ansible_distribution_major_version' from source: facts 7557 1726882084.84466: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882084.84469: variable 'omit' from source: magic vars 7557 1726882084.84477: variable 'omit' from source: magic vars 7557 1726882084.84522: variable 'omit' from source: magic vars 7557 1726882084.84576: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7557 1726882084.84686: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7557 1726882084.84689: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7557 1726882084.84692: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7557 1726882084.84697: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7557 1726882084.84710: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7557 1726882084.84720: variable 'ansible_host' from source: host vars for 
'managed_node3' 7557 1726882084.84729: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882084.84847: Set connection var ansible_module_compression to ZIP_DEFLATED 7557 1726882084.84862: Set connection var ansible_shell_executable to /bin/sh 7557 1726882084.84871: Set connection var ansible_shell_type to sh 7557 1726882084.84883: Set connection var ansible_pipelining to False 7557 1726882084.84891: Set connection var ansible_connection to ssh 7557 1726882084.84908: Set connection var ansible_timeout to 10 7557 1726882084.84936: variable 'ansible_shell_executable' from source: unknown 7557 1726882084.84946: variable 'ansible_connection' from source: unknown 7557 1726882084.84954: variable 'ansible_module_compression' from source: unknown 7557 1726882084.85009: variable 'ansible_shell_type' from source: unknown 7557 1726882084.85012: variable 'ansible_shell_executable' from source: unknown 7557 1726882084.85015: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882084.85017: variable 'ansible_pipelining' from source: unknown 7557 1726882084.85019: variable 'ansible_timeout' from source: unknown 7557 1726882084.85021: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882084.85147: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 7557 1726882084.85328: variable 'omit' from source: magic vars 7557 1726882084.85561: starting attempt loop 7557 1726882084.85565: running the handler 7557 1726882084.85568: handler run complete 7557 1726882084.85570: attempt loop complete, returning result 7557 1726882084.85572: _execute() done 7557 1726882084.85574: dumping result to json 7557 1726882084.85576: done dumping result, returning 7557 1726882084.85578: done running TaskExecutor() for managed_node3/TASK: TEST: I can configure an interface with auto_gateway enabled [12673a56-9f93-ed48-b3a5-00000000000e] 7557 1726882084.85580: sending task result for task 12673a56-9f93-ed48-b3a5-00000000000e ok: [managed_node3] => {} MSG: ################################################## 7557 1726882084.85776: no more pending results, returning what we have 7557 1726882084.85780: results queue empty 7557 1726882084.85782: checking for any_errors_fatal 7557 1726882084.85788: done checking for any_errors_fatal 7557 1726882084.85789: checking for max_fail_percentage 7557 1726882084.85791: done checking for max_fail_percentage 7557 1726882084.85791: checking to see if all hosts have failed and the running result is not ok 7557 1726882084.85792: done checking to see if all hosts have failed 7557 1726882084.85796: getting the remaining hosts for this loop 7557 1726882084.85797: done getting the remaining hosts for this loop 7557 1726882084.85801: getting the next task for host managed_node3 7557 1726882084.85808: done getting next task for host managed_node3 7557 1726882084.85814: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 7557 1726882084.85818: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7557 1726882084.85834: getting variables 7557 1726882084.85836: in VariableManager get_vars() 7557 1726882084.85886: Calling all_inventory to load vars for managed_node3 7557 1726882084.85889: Calling groups_inventory to load vars for managed_node3 7557 1726882084.85891: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882084.86117: Calling all_plugins_play to load vars for managed_node3 7557 1726882084.86121: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882084.86124: Calling groups_plugins_play to load vars for managed_node3 7557 1726882084.86290: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882084.86549: done with get_vars() 7557 1726882084.86560: done getting variables 7557 1726882084.86588: done sending task result for task 12673a56-9f93-ed48-b3a5-00000000000e 7557 1726882084.86591: WORKER PROCESS EXITING TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 21:28:04 -0400 (0:00:00.035) 0:00:10.719 ****** 7557 1726882084.86658: entering _queue_task() for managed_node3/include_tasks 7557 1726882084.86878: worker is 1 (out of 1 available) 7557 1726882084.86890: exiting _queue_task() for managed_node3/include_tasks 7557 1726882084.87071: done queuing things up, now waiting for results queue to drain 7557 1726882084.87073: waiting for pending results... 
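The debug action that just completed belongs to the banner task at tests_auto_gateway.yml:17; as the ok result above shows, it only prints a row of '#' characters. A minimal sketch of what such a task plausibly looks like is given here for orientation; the actual contents of tests_auto_gateway.yml are not reproduced in this log, so the exact YAML is an assumption:

    # Illustrative sketch only -- the real tests_auto_gateway.yml may differ.
    - name: "TEST: I can configure an interface with auto_gateway enabled"
      ansible.builtin.debug:
        msg: "##################################################"
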
7557 1726882084.87342: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 7557 1726882084.87496: in run() - task 12673a56-9f93-ed48-b3a5-000000000016 7557 1726882084.87526: variable 'ansible_search_path' from source: unknown 7557 1726882084.87535: variable 'ansible_search_path' from source: unknown 7557 1726882084.87576: calling self._execute() 7557 1726882084.87670: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882084.87684: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882084.87702: variable 'omit' from source: magic vars 7557 1726882084.88140: variable 'ansible_distribution_major_version' from source: facts 7557 1726882084.88156: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882084.88168: _execute() done 7557 1726882084.88180: dumping result to json 7557 1726882084.88188: done dumping result, returning 7557 1726882084.88203: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [12673a56-9f93-ed48-b3a5-000000000016] 7557 1726882084.88215: sending task result for task 12673a56-9f93-ed48-b3a5-000000000016 7557 1726882084.88331: done sending task result for task 12673a56-9f93-ed48-b3a5-000000000016 7557 1726882084.88334: WORKER PROCESS EXITING 7557 1726882084.88345: no more pending results, returning what we have 7557 1726882084.88350: in VariableManager get_vars() 7557 1726882084.88399: Calling all_inventory to load vars for managed_node3 7557 1726882084.88402: Calling groups_inventory to load vars for managed_node3 7557 1726882084.88404: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882084.88414: Calling all_plugins_play to load vars for managed_node3 7557 1726882084.88416: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882084.88419: Calling groups_plugins_play to load vars for managed_node3 7557 1726882084.88570: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882084.88684: done with get_vars() 7557 1726882084.88690: variable 'ansible_search_path' from source: unknown 7557 1726882084.88690: variable 'ansible_search_path' from source: unknown 7557 1726882084.88717: we have included files to process 7557 1726882084.88718: generating all_blocks data 7557 1726882084.88719: done generating all_blocks data 7557 1726882084.88721: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 7557 1726882084.88723: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 7557 1726882084.88724: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 7557 1726882084.89298: done processing included file 7557 1726882084.89300: iterating over new_blocks loaded from include file 7557 1726882084.89301: in VariableManager get_vars() 7557 1726882084.89402: done with get_vars() 7557 1726882084.89404: filtering new block on tags 7557 1726882084.89418: done filtering new block on tags 7557 1726882084.89420: in VariableManager get_vars() 7557 1726882084.89445: done with get_vars() 7557 1726882084.89446: filtering new block on tags 7557 1726882084.89463: done filtering new block on tags 7557 1726882084.89465: in VariableManager get_vars() 7557 1726882084.89490: done 
with get_vars() 7557 1726882084.89491: filtering new block on tags 7557 1726882084.89509: done filtering new block on tags 7557 1726882084.89511: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node3 7557 1726882084.89517: extending task lists for all hosts with included blocks 7557 1726882084.90284: done extending task lists 7557 1726882084.90286: done processing included files 7557 1726882084.90286: results queue empty 7557 1726882084.90287: checking for any_errors_fatal 7557 1726882084.90289: done checking for any_errors_fatal 7557 1726882084.90290: checking for max_fail_percentage 7557 1726882084.90291: done checking for max_fail_percentage 7557 1726882084.90292: checking to see if all hosts have failed and the running result is not ok 7557 1726882084.90294: done checking to see if all hosts have failed 7557 1726882084.90295: getting the remaining hosts for this loop 7557 1726882084.90296: done getting the remaining hosts for this loop 7557 1726882084.90299: getting the next task for host managed_node3 7557 1726882084.90302: done getting next task for host managed_node3 7557 1726882084.90305: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 7557 1726882084.90308: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7557 1726882084.90317: getting variables 7557 1726882084.90318: in VariableManager get_vars() 7557 1726882084.90335: Calling all_inventory to load vars for managed_node3 7557 1726882084.90338: Calling groups_inventory to load vars for managed_node3 7557 1726882084.90339: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882084.90344: Calling all_plugins_play to load vars for managed_node3 7557 1726882084.90347: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882084.90350: Calling groups_plugins_play to load vars for managed_node3 7557 1726882084.90478: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882084.90666: done with get_vars() 7557 1726882084.90675: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Friday 20 September 2024 21:28:04 -0400 (0:00:00.040) 0:00:10.760 ****** 7557 1726882084.90746: entering _queue_task() for managed_node3/setup 7557 1726882084.90983: worker is 1 (out of 1 available) 7557 1726882084.91231: exiting _queue_task() for managed_node3/setup 7557 1726882084.91242: done queuing things up, now waiting for results queue to drain 7557 1726882084.91243: waiting for pending results... 7557 1726882084.91324: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 7557 1726882084.91577: in run() - task 12673a56-9f93-ed48-b3a5-000000000809 7557 1726882084.91581: variable 'ansible_search_path' from source: unknown 7557 1726882084.91583: variable 'ansible_search_path' from source: unknown 7557 1726882084.91586: calling self._execute() 7557 1726882084.91643: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882084.91655: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882084.91670: variable 'omit' from source: magic vars 7557 1726882084.92049: variable 'ansible_distribution_major_version' from source: facts 7557 1726882084.92066: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882084.92319: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 7557 1726882084.94417: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 7557 1726882084.94498: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 7557 1726882084.94540: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 7557 1726882084.94577: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 7557 1726882084.94698: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 7557 1726882084.94703: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7557 1726882084.94721: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7557 1726882084.94750: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7557 1726882084.94795: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7557 1726882084.94822: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7557 1726882084.94877: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7557 1726882084.94907: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7557 1726882084.94941: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7557 1726882084.94982: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7557 1726882084.95035: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7557 1726882084.95162: variable '__network_required_facts' from source: role '' defaults 7557 1726882084.95176: variable 'ansible_facts' from source: unknown 7557 1726882084.95271: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 7557 1726882084.95279: when evaluation is False, skipping this task 7557 1726882084.95288: _execute() done 7557 1726882084.95362: dumping result to json 7557 1726882084.95365: done dumping result, returning 7557 1726882084.95368: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [12673a56-9f93-ed48-b3a5-000000000809] 7557 1726882084.95370: sending task result for task 12673a56-9f93-ed48-b3a5-000000000809 7557 1726882084.95441: done sending task result for task 12673a56-9f93-ed48-b3a5-000000000809 7557 1726882084.95444: WORKER PROCESS EXITING skipping: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 7557 1726882084.95509: no more pending results, returning what we have 7557 1726882084.95514: results queue empty 7557 1726882084.95515: checking for any_errors_fatal 7557 1726882084.95517: done checking for any_errors_fatal 7557 1726882084.95518: checking for max_fail_percentage 7557 1726882084.95519: done checking for max_fail_percentage 7557 1726882084.95520: checking to see if all hosts have failed and the running result is not ok 7557 1726882084.95521: done checking to see if all hosts have failed 7557 1726882084.95522: getting the remaining hosts for this loop 7557 1726882084.95523: done getting the remaining hosts for this loop 7557 1726882084.95527: getting the next task for host managed_node3 
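The "Ensure ansible_facts used by role are present" task above is skipped because its conditional, __network_required_facts | difference(ansible_facts.keys() | list) | length > 0, evaluates to False (every fact the role needs is already cached), and its skip output is censored because no_log: true is set. A hedged sketch of that guard pattern follows; the log only reveals the task name, the setup action, the conditional and no_log, so the module arguments shown are assumptions:

    # Sketch of the guarded fact refresh; the gather_subset value is assumed.
    - name: Ensure ansible_facts used by role are present
      ansible.builtin.setup:
        gather_subset: min
      when: __network_required_facts | difference(ansible_facts.keys() | list) | length > 0
      no_log: true
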
7557 1726882084.95542: done getting next task for host managed_node3 7557 1726882084.95547: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 7557 1726882084.95551: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7557 1726882084.95564: getting variables 7557 1726882084.95566: in VariableManager get_vars() 7557 1726882084.95622: Calling all_inventory to load vars for managed_node3 7557 1726882084.95625: Calling groups_inventory to load vars for managed_node3 7557 1726882084.95628: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882084.95639: Calling all_plugins_play to load vars for managed_node3 7557 1726882084.95642: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882084.95645: Calling groups_plugins_play to load vars for managed_node3 7557 1726882084.96108: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882084.96312: done with get_vars() 7557 1726882084.96323: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Friday 20 September 2024 21:28:04 -0400 (0:00:00.056) 0:00:10.817 ****** 7557 1726882084.96425: entering _queue_task() for managed_node3/stat 7557 1726882084.96646: worker is 1 (out of 1 available) 7557 1726882084.96658: exiting _queue_task() for managed_node3/stat 7557 1726882084.96670: done queuing things up, now waiting for results queue to drain 7557 1726882084.96671: waiting for pending results... 
7557 1726882084.97002: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if system is ostree 7557 1726882084.97081: in run() - task 12673a56-9f93-ed48-b3a5-00000000080b 7557 1726882084.97099: variable 'ansible_search_path' from source: unknown 7557 1726882084.97103: variable 'ansible_search_path' from source: unknown 7557 1726882084.97126: calling self._execute() 7557 1726882084.97198: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882084.97202: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882084.97211: variable 'omit' from source: magic vars 7557 1726882084.97465: variable 'ansible_distribution_major_version' from source: facts 7557 1726882084.97477: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882084.97586: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 7557 1726882084.97773: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 7557 1726882084.97806: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 7557 1726882084.97832: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 7557 1726882084.97857: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 7557 1726882084.97940: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 7557 1726882084.97958: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 7557 1726882084.97976: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 7557 1726882084.97998: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 7557 1726882084.98059: variable '__network_is_ostree' from source: set_fact 7557 1726882084.98065: Evaluated conditional (not __network_is_ostree is defined): False 7557 1726882084.98068: when evaluation is False, skipping this task 7557 1726882084.98071: _execute() done 7557 1726882084.98074: dumping result to json 7557 1726882084.98076: done dumping result, returning 7557 1726882084.98083: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if system is ostree [12673a56-9f93-ed48-b3a5-00000000080b] 7557 1726882084.98087: sending task result for task 12673a56-9f93-ed48-b3a5-00000000080b 7557 1726882084.98167: done sending task result for task 12673a56-9f93-ed48-b3a5-00000000080b 7557 1726882084.98170: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 7557 1726882084.98244: no more pending results, returning what we have 7557 1726882084.98247: results queue empty 7557 1726882084.98248: checking for any_errors_fatal 7557 1726882084.98253: done checking for any_errors_fatal 7557 1726882084.98254: checking for max_fail_percentage 7557 
1726882084.98255: done checking for max_fail_percentage 7557 1726882084.98256: checking to see if all hosts have failed and the running result is not ok 7557 1726882084.98257: done checking to see if all hosts have failed 7557 1726882084.98258: getting the remaining hosts for this loop 7557 1726882084.98259: done getting the remaining hosts for this loop 7557 1726882084.98262: getting the next task for host managed_node3 7557 1726882084.98267: done getting next task for host managed_node3 7557 1726882084.98270: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 7557 1726882084.98274: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7557 1726882084.98297: getting variables 7557 1726882084.98299: in VariableManager get_vars() 7557 1726882084.98331: Calling all_inventory to load vars for managed_node3 7557 1726882084.98333: Calling groups_inventory to load vars for managed_node3 7557 1726882084.98334: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882084.98340: Calling all_plugins_play to load vars for managed_node3 7557 1726882084.98342: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882084.98344: Calling groups_plugins_play to load vars for managed_node3 7557 1726882084.98454: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882084.98578: done with get_vars() 7557 1726882084.98586: done getting variables 7557 1726882084.98643: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Friday 20 September 2024 21:28:04 -0400 (0:00:00.022) 0:00:10.839 ****** 7557 1726882084.98667: entering _queue_task() for managed_node3/set_fact 7557 1726882084.98906: worker is 1 (out of 1 available) 7557 1726882084.98920: exiting _queue_task() for managed_node3/set_fact 7557 1726882084.98934: done queuing things up, now waiting for results queue to drain 7557 1726882084.98935: waiting for pending results... 
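The two ostree tasks in this stretch (the stat check at set_facts.yml:12 above and the set_fact at set_facts.yml:17 about to run) follow a common memoization pattern: both are guarded by not __network_is_ostree is defined, so once the flag exists they are skipped, exactly as the false_condition in the skip output shows. A rough sketch of the pattern; the stat path and register name are assumptions, since the log exposes only the task names, the modules and the conditional:

    # Illustrative pattern; /run/ostree-booted and the register name are assumed.
    - name: Check if system is ostree
      ansible.builtin.stat:
        path: /run/ostree-booted
      register: __ostree_booted_stat
      when: not __network_is_ostree is defined

    - name: Set flag to indicate system is ostree
      ansible.builtin.set_fact:
        __network_is_ostree: "{{ __ostree_booted_stat.stat.exists }}"
      when: not __network_is_ostree is defined
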
7557 1726882084.99236: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 7557 1726882084.99427: in run() - task 12673a56-9f93-ed48-b3a5-00000000080c 7557 1726882084.99432: variable 'ansible_search_path' from source: unknown 7557 1726882084.99434: variable 'ansible_search_path' from source: unknown 7557 1726882084.99439: calling self._execute() 7557 1726882084.99546: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882084.99613: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882084.99617: variable 'omit' from source: magic vars 7557 1726882084.99941: variable 'ansible_distribution_major_version' from source: facts 7557 1726882084.99951: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882085.00068: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 7557 1726882085.00259: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 7557 1726882085.00291: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 7557 1726882085.00321: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 7557 1726882085.00346: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 7557 1726882085.00410: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 7557 1726882085.00431: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 7557 1726882085.00449: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 7557 1726882085.00467: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 7557 1726882085.00538: variable '__network_is_ostree' from source: set_fact 7557 1726882085.00541: Evaluated conditional (not __network_is_ostree is defined): False 7557 1726882085.00544: when evaluation is False, skipping this task 7557 1726882085.00546: _execute() done 7557 1726882085.00549: dumping result to json 7557 1726882085.00551: done dumping result, returning 7557 1726882085.00557: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [12673a56-9f93-ed48-b3a5-00000000080c] 7557 1726882085.00563: sending task result for task 12673a56-9f93-ed48-b3a5-00000000080c 7557 1726882085.00645: done sending task result for task 12673a56-9f93-ed48-b3a5-00000000080c 7557 1726882085.00648: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 7557 1726882085.00689: no more pending results, returning what we have 7557 1726882085.00695: results queue empty 7557 1726882085.00696: checking for any_errors_fatal 7557 1726882085.00701: done checking for any_errors_fatal 7557 1726882085.00701: checking for 
max_fail_percentage 7557 1726882085.00703: done checking for max_fail_percentage 7557 1726882085.00704: checking to see if all hosts have failed and the running result is not ok 7557 1726882085.00705: done checking to see if all hosts have failed 7557 1726882085.00706: getting the remaining hosts for this loop 7557 1726882085.00708: done getting the remaining hosts for this loop 7557 1726882085.00711: getting the next task for host managed_node3 7557 1726882085.00720: done getting next task for host managed_node3 7557 1726882085.00724: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 7557 1726882085.00727: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7557 1726882085.00741: getting variables 7557 1726882085.00742: in VariableManager get_vars() 7557 1726882085.00788: Calling all_inventory to load vars for managed_node3 7557 1726882085.00790: Calling groups_inventory to load vars for managed_node3 7557 1726882085.00794: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882085.00803: Calling all_plugins_play to load vars for managed_node3 7557 1726882085.00805: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882085.00808: Calling groups_plugins_play to load vars for managed_node3 7557 1726882085.00973: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882085.01102: done with get_vars() 7557 1726882085.01111: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Friday 20 September 2024 21:28:05 -0400 (0:00:00.025) 0:00:10.864 ****** 7557 1726882085.01175: entering _queue_task() for managed_node3/service_facts 7557 1726882085.01177: Creating lock for service_facts 7557 1726882085.01398: worker is 1 (out of 1 available) 7557 1726882085.01413: exiting _queue_task() for managed_node3/service_facts 7557 1726882085.01426: done queuing things up, now waiting for results queue to drain 7557 1726882085.01427: waiting for pending results... 
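The role now queues "Check which services are running" from set_facts.yml:21, which resolves to the service_facts module (note the lock being created for it, since this is the module's first use in the run); its results populate ansible_facts.services, which the role can later inspect to decide between NetworkManager and other providers. A minimal sketch of such a task, assuming no extra parameters beyond what the log names:

    # Sketch; the real task in set_facts.yml may differ in detail.
    - name: Check which services are running
      ansible.builtin.service_facts:
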
7557 1726882085.01688: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which services are running 7557 1726882085.01760: in run() - task 12673a56-9f93-ed48-b3a5-00000000080e 7557 1726882085.01780: variable 'ansible_search_path' from source: unknown 7557 1726882085.01789: variable 'ansible_search_path' from source: unknown 7557 1726882085.01834: calling self._execute() 7557 1726882085.01936: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882085.01949: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882085.01966: variable 'omit' from source: magic vars 7557 1726882085.02352: variable 'ansible_distribution_major_version' from source: facts 7557 1726882085.02371: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882085.02382: variable 'omit' from source: magic vars 7557 1726882085.02462: variable 'omit' from source: magic vars 7557 1726882085.02504: variable 'omit' from source: magic vars 7557 1726882085.02598: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7557 1726882085.02602: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7557 1726882085.02616: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7557 1726882085.02646: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7557 1726882085.02662: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7557 1726882085.02697: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7557 1726882085.02706: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882085.02714: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882085.02825: Set connection var ansible_module_compression to ZIP_DEFLATED 7557 1726882085.02847: Set connection var ansible_shell_executable to /bin/sh 7557 1726882085.02851: Set connection var ansible_shell_type to sh 7557 1726882085.02898: Set connection var ansible_pipelining to False 7557 1726882085.02902: Set connection var ansible_connection to ssh 7557 1726882085.02904: Set connection var ansible_timeout to 10 7557 1726882085.02906: variable 'ansible_shell_executable' from source: unknown 7557 1726882085.02909: variable 'ansible_connection' from source: unknown 7557 1726882085.02911: variable 'ansible_module_compression' from source: unknown 7557 1726882085.02921: variable 'ansible_shell_type' from source: unknown 7557 1726882085.02928: variable 'ansible_shell_executable' from source: unknown 7557 1726882085.02935: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882085.02955: variable 'ansible_pipelining' from source: unknown 7557 1726882085.02958: variable 'ansible_timeout' from source: unknown 7557 1726882085.02960: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882085.03168: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 7557 1726882085.03183: variable 'omit' from source: magic vars 7557 1726882085.03186: starting attempt loop 
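The block above shows per-task connection variable resolution: most ansible_* connection vars come back as "unknown" (nothing set them explicitly), so the values logged as "Set connection var" are the effective defaults for this host: an ssh connection, an sh shell at /bin/sh, pipelining off, a 10-second timeout and ZIP_DEFLATED module compression. These are ordinary Ansible connection variables; the same effective configuration could be pinned explicitly in inventory or group_vars, roughly as below (illustrative only; the logged run relies on defaults plus the ansible_host and ansible_ssh_extra_args host vars from the inventory):

    # Explicit equivalents of the connection vars resolved above.
    ansible_connection: ssh
    ansible_shell_type: sh
    ansible_shell_executable: /bin/sh
    ansible_pipelining: false
    ansible_timeout: 10
    ansible_module_compression: ZIP_DEFLATED
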
7557 1726882085.03189: running the handler 7557 1726882085.03204: _low_level_execute_command(): starting 7557 1726882085.03211: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7557 1726882085.03708: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882085.03714: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882085.03717: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7557 1726882085.03720: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882085.03769: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882085.03776: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882085.03830: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882085.05509: stdout chunk (state=3): >>>/root <<< 7557 1726882085.05622: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882085.05658: stderr chunk (state=3): >>><<< 7557 1726882085.05681: stdout chunk (state=3): >>><<< 7557 1726882085.05707: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7557 1726882085.05816: _low_level_execute_command(): starting 7557 1726882085.05820: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882085.0571551-8070-274811532752313 `" && echo 
ansible-tmp-1726882085.0571551-8070-274811532752313="` echo /root/.ansible/tmp/ansible-tmp-1726882085.0571551-8070-274811532752313 `" ) && sleep 0' 7557 1726882085.06401: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7557 1726882085.06423: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7557 1726882085.06445: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882085.06469: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882085.06526: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882085.06530: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882085.06581: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882085.08467: stdout chunk (state=3): >>>ansible-tmp-1726882085.0571551-8070-274811532752313=/root/.ansible/tmp/ansible-tmp-1726882085.0571551-8070-274811532752313 <<< 7557 1726882085.08625: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882085.08628: stdout chunk (state=3): >>><<< 7557 1726882085.08631: stderr chunk (state=3): >>><<< 7557 1726882085.08644: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882085.0571551-8070-274811532752313=/root/.ansible/tmp/ansible-tmp-1726882085.0571551-8070-274811532752313 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7557 1726882085.08800: variable 'ansible_module_compression' from source: unknown 7557 1726882085.08804: ANSIBALLZ: Using lock for service_facts 
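What follows is the standard module-delivery sequence: the service_facts module is packed into a self-contained AnsiballZ_service_facts.py, copied over SFTP into the remote temp directory created above, made executable, run with the remote /usr/bin/python3.12, and its JSON reply (the large ansible_facts.services payload further below) is read back over the same multiplexed SSH connection. Once that payload lands it is addressable like any other fact; a hedged sketch of how a later task could consume it (the key used here, NetworkManager.service, does appear in the returned data, but this task is not part of the logged run):

    # Illustrative follow-up task, not part of the logged playbook.
    - name: Report NetworkManager state from the gathered service facts
      ansible.builtin.debug:
        msg: "NetworkManager is {{ ansible_facts.services['NetworkManager.service'].state }}"
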
7557 1726882085.08806: ANSIBALLZ: Acquiring lock 7557 1726882085.08809: ANSIBALLZ: Lock acquired: 140194281375072 7557 1726882085.08813: ANSIBALLZ: Creating module 7557 1726882085.16689: ANSIBALLZ: Writing module into payload 7557 1726882085.16754: ANSIBALLZ: Writing module 7557 1726882085.16774: ANSIBALLZ: Renaming module 7557 1726882085.16779: ANSIBALLZ: Done creating module 7557 1726882085.16797: variable 'ansible_facts' from source: unknown 7557 1726882085.16845: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882085.0571551-8070-274811532752313/AnsiballZ_service_facts.py 7557 1726882085.16945: Sending initial data 7557 1726882085.16948: Sent initial data (160 bytes) 7557 1726882085.17368: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882085.17403: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7557 1726882085.17406: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found <<< 7557 1726882085.17408: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882085.17410: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 7557 1726882085.17412: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7557 1726882085.17414: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882085.17456: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882085.17467: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882085.17522: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882085.19080: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 7557 1726882085.19083: stderr chunk (state=3): >>>debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 7557 1726882085.19122: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 7557 1726882085.19174: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-7557ap94rh2e/tmp769diiqy /root/.ansible/tmp/ansible-tmp-1726882085.0571551-8070-274811532752313/AnsiballZ_service_facts.py <<< 7557 1726882085.19177: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882085.0571551-8070-274811532752313/AnsiballZ_service_facts.py" <<< 7557 1726882085.19216: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-7557ap94rh2e/tmp769diiqy" to remote "/root/.ansible/tmp/ansible-tmp-1726882085.0571551-8070-274811532752313/AnsiballZ_service_facts.py" <<< 7557 1726882085.19220: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882085.0571551-8070-274811532752313/AnsiballZ_service_facts.py" <<< 7557 1726882085.19770: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882085.19812: stderr chunk (state=3): >>><<< 7557 1726882085.19815: stdout chunk (state=3): >>><<< 7557 1726882085.19836: done transferring module to remote 7557 1726882085.19844: _low_level_execute_command(): starting 7557 1726882085.19849: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882085.0571551-8070-274811532752313/ /root/.ansible/tmp/ansible-tmp-1726882085.0571551-8070-274811532752313/AnsiballZ_service_facts.py && sleep 0' 7557 1726882085.20254: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7557 1726882085.20287: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found <<< 7557 1726882085.20295: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882085.20298: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address <<< 7557 1726882085.20300: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882085.20302: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882085.20343: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882085.20346: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882085.20399: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882085.22077: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882085.22103: stderr chunk (state=3): >>><<< 7557 1726882085.22106: stdout chunk (state=3): >>><<< 7557 1726882085.22120: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7557 1726882085.22123: _low_level_execute_command(): starting 7557 1726882085.22128: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882085.0571551-8070-274811532752313/AnsiballZ_service_facts.py && sleep 0' 7557 1726882085.22559: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7557 1726882085.22562: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found <<< 7557 1726882085.22565: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 7557 1726882085.22567: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882085.22569: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882085.22606: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882085.22609: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882085.22621: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882085.22683: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882086.85934: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, 
"cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "st<<< 7557 1726882086.85953: stdout chunk (state=3): >>>opped", 
"status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", 
"state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", 
"state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "stat<<< 7557 1726882086.85988: stdout chunk (state=3): >>>us": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": 
"dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", 
"source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": 
"systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, 
"systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 7557 1726882086.87599: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. <<< 7557 1726882086.87604: stdout chunk (state=3): >>><<< 7557 1726882086.87606: stderr chunk (state=3): >>><<< 7557 1726882086.87612: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, 
"systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": 
{"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", 
"source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": 
"indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": 
{"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. 
7557 1726882086.87997: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882085.0571551-8070-274811532752313/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7557 1726882086.88002: _low_level_execute_command(): starting 7557 1726882086.88008: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882085.0571551-8070-274811532752313/ > /dev/null 2>&1 && sleep 0' 7557 1726882086.88471: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882086.88475: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882086.88477: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 7557 1726882086.88479: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found <<< 7557 1726882086.88481: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882086.88534: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882086.88537: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882086.88542: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882086.88587: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882086.90416: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882086.90438: stderr chunk (state=3): >>><<< 7557 1726882086.90441: stdout chunk (state=3): >>><<< 7557 1726882086.90455: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7557 1726882086.90460: handler run complete 7557 1726882086.90573: variable 'ansible_facts' from source: unknown 7557 1726882086.90670: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882086.90930: variable 'ansible_facts' from source: unknown 7557 1726882086.91811: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882086.91927: attempt loop complete, returning result 7557 1726882086.91931: _execute() done 7557 1726882086.91933: dumping result to json 7557 1726882086.91966: done dumping result, returning 7557 1726882086.91977: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which services are running [12673a56-9f93-ed48-b3a5-00000000080e] 7557 1726882086.91984: sending task result for task 12673a56-9f93-ed48-b3a5-00000000080e 7557 1726882086.92850: done sending task result for task 12673a56-9f93-ed48-b3a5-00000000080e 7557 1726882086.92855: WORKER PROCESS EXITING ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 7557 1726882086.92916: no more pending results, returning what we have 7557 1726882086.92920: results queue empty 7557 1726882086.92921: checking for any_errors_fatal 7557 1726882086.92926: done checking for any_errors_fatal 7557 1726882086.92927: checking for max_fail_percentage 7557 1726882086.92929: done checking for max_fail_percentage 7557 1726882086.92930: checking to see if all hosts have failed and the running result is not ok 7557 1726882086.92931: done checking to see if all hosts have failed 7557 1726882086.92931: getting the remaining hosts for this loop 7557 1726882086.92933: done getting the remaining hosts for this loop 7557 1726882086.92937: getting the next task for host managed_node3 7557 1726882086.92942: done getting next task for host managed_node3 7557 1726882086.92946: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 7557 1726882086.92951: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7557 1726882086.92961: getting variables 7557 1726882086.92962: in VariableManager get_vars() 7557 1726882086.93008: Calling all_inventory to load vars for managed_node3 7557 1726882086.93012: Calling groups_inventory to load vars for managed_node3 7557 1726882086.93014: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882086.93024: Calling all_plugins_play to load vars for managed_node3 7557 1726882086.93027: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882086.93030: Calling groups_plugins_play to load vars for managed_node3 7557 1726882086.93424: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882086.93902: done with get_vars() 7557 1726882086.93917: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 21:28:06 -0400 (0:00:01.928) 0:00:12.793 ****** 7557 1726882086.94029: entering _queue_task() for managed_node3/package_facts 7557 1726882086.94031: Creating lock for package_facts 7557 1726882086.94344: worker is 1 (out of 1 available) 7557 1726882086.94358: exiting _queue_task() for managed_node3/package_facts 7557 1726882086.94372: done queuing things up, now waiting for results queue to drain 7557 1726882086.94374: waiting for pending results... 7557 1726882086.94818: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which packages are installed 7557 1726882086.94846: in run() - task 12673a56-9f93-ed48-b3a5-00000000080f 7557 1726882086.94870: variable 'ansible_search_path' from source: unknown 7557 1726882086.94880: variable 'ansible_search_path' from source: unknown 7557 1726882086.94924: calling self._execute() 7557 1726882086.95026: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882086.95045: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882086.95064: variable 'omit' from source: magic vars 7557 1726882086.95467: variable 'ansible_distribution_major_version' from source: facts 7557 1726882086.95496: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882086.95510: variable 'omit' from source: magic vars 7557 1726882086.95583: variable 'omit' from source: magic vars 7557 1726882086.95632: variable 'omit' from source: magic vars 7557 1726882086.95714: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7557 1726882086.95724: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7557 1726882086.95747: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7557 1726882086.95771: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7557 1726882086.95789: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7557 1726882086.95831: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7557 1726882086.95898: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882086.95902: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882086.95960: Set connection var ansible_module_compression 
to ZIP_DEFLATED 7557 1726882086.95974: Set connection var ansible_shell_executable to /bin/sh 7557 1726882086.95984: Set connection var ansible_shell_type to sh 7557 1726882086.95997: Set connection var ansible_pipelining to False 7557 1726882086.96006: Set connection var ansible_connection to ssh 7557 1726882086.96017: Set connection var ansible_timeout to 10 7557 1726882086.96051: variable 'ansible_shell_executable' from source: unknown 7557 1726882086.96061: variable 'ansible_connection' from source: unknown 7557 1726882086.96069: variable 'ansible_module_compression' from source: unknown 7557 1726882086.96076: variable 'ansible_shell_type' from source: unknown 7557 1726882086.96083: variable 'ansible_shell_executable' from source: unknown 7557 1726882086.96150: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882086.96153: variable 'ansible_pipelining' from source: unknown 7557 1726882086.96155: variable 'ansible_timeout' from source: unknown 7557 1726882086.96158: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882086.96338: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 7557 1726882086.96358: variable 'omit' from source: magic vars 7557 1726882086.96376: starting attempt loop 7557 1726882086.96385: running the handler 7557 1726882086.96409: _low_level_execute_command(): starting 7557 1726882086.96426: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7557 1726882086.97395: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882086.97433: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882086.97464: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882086.97506: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882086.97550: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882086.99139: stdout chunk (state=3): >>>/root <<< 7557 1726882086.99297: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882086.99301: stdout chunk (state=3): >>><<< 7557 1726882086.99303: stderr chunk (state=3): >>><<< 7557 1726882086.99326: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration 
data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7557 1726882086.99346: _low_level_execute_command(): starting 7557 1726882086.99435: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882086.993336-8147-81704578648788 `" && echo ansible-tmp-1726882086.993336-8147-81704578648788="` echo /root/.ansible/tmp/ansible-tmp-1726882086.993336-8147-81704578648788 `" ) && sleep 0' 7557 1726882086.99985: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7557 1726882087.00003: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7557 1726882087.00021: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882087.00045: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7557 1726882087.00061: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 <<< 7557 1726882087.00112: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882087.00185: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882087.00214: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882087.00230: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882087.00312: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882087.02169: stdout chunk (state=3): >>>ansible-tmp-1726882086.993336-8147-81704578648788=/root/.ansible/tmp/ansible-tmp-1726882086.993336-8147-81704578648788 <<< 7557 1726882087.02275: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882087.02306: stderr chunk (state=3): >>><<< 7557 1726882087.02309: stdout chunk (state=3): >>><<< 7557 1726882087.02331: 
_low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882086.993336-8147-81704578648788=/root/.ansible/tmp/ansible-tmp-1726882086.993336-8147-81704578648788 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7557 1726882087.02369: variable 'ansible_module_compression' from source: unknown 7557 1726882087.02412: ANSIBALLZ: Using lock for package_facts 7557 1726882087.02416: ANSIBALLZ: Acquiring lock 7557 1726882087.02418: ANSIBALLZ: Lock acquired: 140194282304096 7557 1726882087.02422: ANSIBALLZ: Creating module 7557 1726882087.34983: ANSIBALLZ: Writing module into payload 7557 1726882087.35050: ANSIBALLZ: Writing module 7557 1726882087.35075: ANSIBALLZ: Renaming module 7557 1726882087.35086: ANSIBALLZ: Done creating module 7557 1726882087.35130: variable 'ansible_facts' from source: unknown 7557 1726882087.35319: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882086.993336-8147-81704578648788/AnsiballZ_package_facts.py 7557 1726882087.35631: Sending initial data 7557 1726882087.35735: Sent initial data (158 bytes) 7557 1726882087.36143: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7557 1726882087.36150: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7557 1726882087.36163: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found <<< 7557 1726882087.36169: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882087.36191: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration <<< 7557 1726882087.36201: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882087.36267: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found <<< 7557 1726882087.36271: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882087.36273: 
stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882087.36276: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882087.36300: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882087.36370: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882087.38189: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 7557 1726882087.38192: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 <<< 7557 1726882087.38243: stderr chunk (state=3): >>>debug2: Server supports extension "statvfs@openssh.com" revision 2 <<< 7557 1726882087.38246: stderr chunk (state=3): >>>debug2: Server supports extension "fstatvfs@openssh.com" revision 2 <<< 7557 1726882087.38249: stderr chunk (state=3): >>>debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 7557 1726882087.38299: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 7557 1726882087.38361: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-7557ap94rh2e/tmpj2ppt5wf /root/.ansible/tmp/ansible-tmp-1726882086.993336-8147-81704578648788/AnsiballZ_package_facts.py <<< 7557 1726882087.38364: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882086.993336-8147-81704578648788/AnsiballZ_package_facts.py" <<< 7557 1726882087.38404: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-7557ap94rh2e/tmpj2ppt5wf" to remote "/root/.ansible/tmp/ansible-tmp-1726882086.993336-8147-81704578648788/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882086.993336-8147-81704578648788/AnsiballZ_package_facts.py" <<< 7557 1726882087.39984: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882087.39987: stdout chunk (state=3): >>><<< 7557 1726882087.39990: stderr chunk (state=3): >>><<< 7557 1726882087.40004: done transferring module to remote 7557 1726882087.40018: _low_level_execute_command(): starting 7557 1726882087.40028: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882086.993336-8147-81704578648788/ /root/.ansible/tmp/ansible-tmp-1726882086.993336-8147-81704578648788/AnsiballZ_package_facts.py && sleep 0' 7557 1726882087.40683: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7557 1726882087.40701: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7557 1726882087.40716: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882087.40733: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7557 1726882087.40757: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 <<< 7557 1726882087.40861: stderr 
chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882087.40886: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882087.40973: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882087.43131: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882087.43136: stdout chunk (state=3): >>><<< 7557 1726882087.43138: stderr chunk (state=3): >>><<< 7557 1726882087.43140: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7557 1726882087.43143: _low_level_execute_command(): starting 7557 1726882087.43145: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882086.993336-8147-81704578648788/AnsiballZ_package_facts.py && sleep 0' 7557 1726882087.44177: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: 
match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882087.44214: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882087.44318: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882087.88049: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": 
"realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", 
"release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "rele<<< 7557 1726882087.88257: stdout chunk (state=3): >>>ase": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": 
"file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": 
"rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": 
"2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": 
"x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": 
"5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", 
"version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", 
"version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": 
"1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": 
[{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", 
"epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", 
"epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 7557 1726882087.89940: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882087.90099: stderr chunk (state=3): >>>Shared connection to 10.31.10.229 closed. 
<<< 7557 1726882087.90102: stderr chunk (state=3): >>><<< 7557 1726882087.90105: stdout chunk (state=3): >>><<< 7557 1726882087.90202: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": 
"amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": 
"17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], 
"libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": 
"1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": 
"1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": 
[{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": 
"510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], 
"perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], 
"perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. 
7557 1726882087.93475: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882086.993336-8147-81704578648788/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7557 1726882087.93562: _low_level_execute_command(): starting 7557 1726882087.93565: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882086.993336-8147-81704578648788/ > /dev/null 2>&1 && sleep 0' 7557 1726882087.94299: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7557 1726882087.94317: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7557 1726882087.94323: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882087.94336: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7557 1726882087.94408: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882087.94436: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882087.94481: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882087.94484: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882087.94527: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882087.96531: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882087.96534: stdout chunk (state=3): >>><<< 7557 1726882087.96536: stderr chunk (state=3): >>><<< 7557 1726882087.96539: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7557 1726882087.96541: handler run complete 7557 1726882088.09698: variable 'ansible_facts' from source: unknown 7557 1726882088.10737: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882088.15398: variable 'ansible_facts' from source: unknown 7557 1726882088.16313: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882088.17121: attempt loop complete, returning result 7557 1726882088.17152: _execute() done 7557 1726882088.17198: dumping result to json 7557 1726882088.17424: done dumping result, returning 7557 1726882088.17443: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which packages are installed [12673a56-9f93-ed48-b3a5-00000000080f] 7557 1726882088.17456: sending task result for task 12673a56-9f93-ed48-b3a5-00000000080f 7557 1726882088.21240: done sending task result for task 12673a56-9f93-ed48-b3a5-00000000080f 7557 1726882088.21245: WORKER PROCESS EXITING ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 7557 1726882088.21339: no more pending results, returning what we have 7557 1726882088.21342: results queue empty 7557 1726882088.21343: checking for any_errors_fatal 7557 1726882088.21351: done checking for any_errors_fatal 7557 1726882088.21353: checking for max_fail_percentage 7557 1726882088.21354: done checking for max_fail_percentage 7557 1726882088.21355: checking to see if all hosts have failed and the running result is not ok 7557 1726882088.21356: done checking to see if all hosts have failed 7557 1726882088.21356: getting the remaining hosts for this loop 7557 1726882088.21358: done getting the remaining hosts for this loop 7557 1726882088.21361: getting the next task for host managed_node3 7557 1726882088.21367: done getting next task for host managed_node3 7557 1726882088.21371: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 7557 1726882088.21373: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7557 1726882088.21385: getting variables 7557 1726882088.21386: in VariableManager get_vars() 7557 1726882088.21432: Calling all_inventory to load vars for managed_node3 7557 1726882088.21435: Calling groups_inventory to load vars for managed_node3 7557 1726882088.21437: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882088.21446: Calling all_plugins_play to load vars for managed_node3 7557 1726882088.21449: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882088.21451: Calling groups_plugins_play to load vars for managed_node3 7557 1726882088.22869: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882088.25262: done with get_vars() 7557 1726882088.25290: done getting variables 7557 1726882088.25348: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 21:28:08 -0400 (0:00:01.313) 0:00:14.106 ****** 7557 1726882088.25378: entering _queue_task() for managed_node3/debug 7557 1726882088.25615: worker is 1 (out of 1 available) 7557 1726882088.25630: exiting _queue_task() for managed_node3/debug 7557 1726882088.25643: done queuing things up, now waiting for results queue to drain 7557 1726882088.25644: waiting for pending results... 7557 1726882088.25822: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Print network provider 7557 1726882088.25913: in run() - task 12673a56-9f93-ed48-b3a5-000000000017 7557 1726882088.25926: variable 'ansible_search_path' from source: unknown 7557 1726882088.25930: variable 'ansible_search_path' from source: unknown 7557 1726882088.25960: calling self._execute() 7557 1726882088.26035: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882088.26041: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882088.26050: variable 'omit' from source: magic vars 7557 1726882088.26331: variable 'ansible_distribution_major_version' from source: facts 7557 1726882088.26341: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882088.26347: variable 'omit' from source: magic vars 7557 1726882088.26384: variable 'omit' from source: magic vars 7557 1726882088.26461: variable 'network_provider' from source: set_fact 7557 1726882088.26475: variable 'omit' from source: magic vars 7557 1726882088.26511: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7557 1726882088.26542: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7557 1726882088.26557: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7557 1726882088.26569: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7557 1726882088.26580: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7557 1726882088.26610: variable 
'inventory_hostname' from source: host vars for 'managed_node3' 7557 1726882088.26613: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882088.26671: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882088.26736: Set connection var ansible_module_compression to ZIP_DEFLATED 7557 1726882088.26744: Set connection var ansible_shell_executable to /bin/sh 7557 1726882088.26797: Set connection var ansible_shell_type to sh 7557 1726882088.26801: Set connection var ansible_pipelining to False 7557 1726882088.26803: Set connection var ansible_connection to ssh 7557 1726882088.26805: Set connection var ansible_timeout to 10 7557 1726882088.26808: variable 'ansible_shell_executable' from source: unknown 7557 1726882088.26810: variable 'ansible_connection' from source: unknown 7557 1726882088.26816: variable 'ansible_module_compression' from source: unknown 7557 1726882088.26828: variable 'ansible_shell_type' from source: unknown 7557 1726882088.26835: variable 'ansible_shell_executable' from source: unknown 7557 1726882088.26842: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882088.26855: variable 'ansible_pipelining' from source: unknown 7557 1726882088.26863: variable 'ansible_timeout' from source: unknown 7557 1726882088.26895: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882088.27042: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 7557 1726882088.27198: variable 'omit' from source: magic vars 7557 1726882088.27202: starting attempt loop 7557 1726882088.27205: running the handler 7557 1726882088.27207: handler run complete 7557 1726882088.27210: attempt loop complete, returning result 7557 1726882088.27212: _execute() done 7557 1726882088.27214: dumping result to json 7557 1726882088.27216: done dumping result, returning 7557 1726882088.27218: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Print network provider [12673a56-9f93-ed48-b3a5-000000000017] 7557 1726882088.27221: sending task result for task 12673a56-9f93-ed48-b3a5-000000000017 7557 1726882088.27294: done sending task result for task 12673a56-9f93-ed48-b3a5-000000000017 7557 1726882088.27300: WORKER PROCESS EXITING ok: [managed_node3] => {} MSG: Using network provider: nm 7557 1726882088.27372: no more pending results, returning what we have 7557 1726882088.27377: results queue empty 7557 1726882088.27378: checking for any_errors_fatal 7557 1726882088.27391: done checking for any_errors_fatal 7557 1726882088.27391: checking for max_fail_percentage 7557 1726882088.27395: done checking for max_fail_percentage 7557 1726882088.27397: checking to see if all hosts have failed and the running result is not ok 7557 1726882088.27398: done checking to see if all hosts have failed 7557 1726882088.27399: getting the remaining hosts for this loop 7557 1726882088.27401: done getting the remaining hosts for this loop 7557 1726882088.27404: getting the next task for host managed_node3 7557 1726882088.27410: done getting next task for host managed_node3 7557 1726882088.27414: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts 
provider 7557 1726882088.27418: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7557 1726882088.27429: getting variables 7557 1726882088.27431: in VariableManager get_vars() 7557 1726882088.27481: Calling all_inventory to load vars for managed_node3 7557 1726882088.27484: Calling groups_inventory to load vars for managed_node3 7557 1726882088.27486: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882088.27601: Calling all_plugins_play to load vars for managed_node3 7557 1726882088.27606: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882088.27609: Calling groups_plugins_play to load vars for managed_node3 7557 1726882088.29466: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882088.30715: done with get_vars() 7557 1726882088.30738: done getting variables 7557 1726882088.30782: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 21:28:08 -0400 (0:00:00.054) 0:00:14.161 ****** 7557 1726882088.30811: entering _queue_task() for managed_node3/fail 7557 1726882088.31050: worker is 1 (out of 1 available) 7557 1726882088.31064: exiting _queue_task() for managed_node3/fail 7557 1726882088.31077: done queuing things up, now waiting for results queue to drain 7557 1726882088.31078: waiting for pending results... 
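The ok result above, MSG: Using network provider: nm, comes from the role's Print network provider task at roles/network/tasks/main.yml:7; it reads the network_provider variable set earlier via set_fact. A minimal equivalent debug task, reconstructed from the logged message rather than copied verbatim from the role:

- name: Print network provider
  ansible.builtin.debug:
    msg: "Using network provider: {{ network_provider }}"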
7557 1726882088.31256: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 7557 1726882088.31355: in run() - task 12673a56-9f93-ed48-b3a5-000000000018 7557 1726882088.31367: variable 'ansible_search_path' from source: unknown 7557 1726882088.31372: variable 'ansible_search_path' from source: unknown 7557 1726882088.31403: calling self._execute() 7557 1726882088.31474: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882088.31478: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882088.31486: variable 'omit' from source: magic vars 7557 1726882088.31901: variable 'ansible_distribution_major_version' from source: facts 7557 1726882088.31905: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882088.32039: variable 'network_state' from source: role '' defaults 7557 1726882088.32055: Evaluated conditional (network_state != {}): False 7557 1726882088.32063: when evaluation is False, skipping this task 7557 1726882088.32070: _execute() done 7557 1726882088.32077: dumping result to json 7557 1726882088.32083: done dumping result, returning 7557 1726882088.32100: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [12673a56-9f93-ed48-b3a5-000000000018] 7557 1726882088.32140: sending task result for task 12673a56-9f93-ed48-b3a5-000000000018 skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 7557 1726882088.32340: no more pending results, returning what we have 7557 1726882088.32345: results queue empty 7557 1726882088.32346: checking for any_errors_fatal 7557 1726882088.32352: done checking for any_errors_fatal 7557 1726882088.32353: checking for max_fail_percentage 7557 1726882088.32355: done checking for max_fail_percentage 7557 1726882088.32355: checking to see if all hosts have failed and the running result is not ok 7557 1726882088.32356: done checking to see if all hosts have failed 7557 1726882088.32357: getting the remaining hosts for this loop 7557 1726882088.32359: done getting the remaining hosts for this loop 7557 1726882088.32362: getting the next task for host managed_node3 7557 1726882088.32370: done getting next task for host managed_node3 7557 1726882088.32374: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 7557 1726882088.32377: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7557 1726882088.32499: getting variables 7557 1726882088.32502: in VariableManager get_vars() 7557 1726882088.32560: Calling all_inventory to load vars for managed_node3 7557 1726882088.32564: Calling groups_inventory to load vars for managed_node3 7557 1726882088.32566: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882088.32625: Calling all_plugins_play to load vars for managed_node3 7557 1726882088.32629: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882088.32632: Calling groups_plugins_play to load vars for managed_node3 7557 1726882088.33200: done sending task result for task 12673a56-9f93-ed48-b3a5-000000000018 7557 1726882088.33204: WORKER PROCESS EXITING 7557 1726882088.33865: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882088.34730: done with get_vars() 7557 1726882088.34749: done getting variables 7557 1726882088.34796: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 21:28:08 -0400 (0:00:00.040) 0:00:14.201 ****** 7557 1726882088.34822: entering _queue_task() for managed_node3/fail 7557 1726882088.35049: worker is 1 (out of 1 available) 7557 1726882088.35065: exiting _queue_task() for managed_node3/fail 7557 1726882088.35078: done queuing things up, now waiting for results queue to drain 7557 1726882088.35079: waiting for pending results... 
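The initscripts guard above was skipped because its condition network_state != {} evaluated to False (after the play-wide ansible_distribution_major_version != '6' check passed), and the version guard queued here is evaluated the same way a few lines below. A rough sketch of this fail-with-when guard pattern; the msg wording and the extra provider condition are assumptions, not copied from the role:

- name: Abort applying the network state configuration if using the network_state variable with the initscripts provider
  ansible.builtin.fail:
    msg: Applying network_state is not supported with the initscripts provider (illustrative message)
  when:
    - network_state != {}
    - network_provider == "initscripts"   # assumed second condition; only the first appears in this log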
7557 1726882088.35252: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 7557 1726882088.35349: in run() - task 12673a56-9f93-ed48-b3a5-000000000019 7557 1726882088.35359: variable 'ansible_search_path' from source: unknown 7557 1726882088.35363: variable 'ansible_search_path' from source: unknown 7557 1726882088.35391: calling self._execute() 7557 1726882088.35465: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882088.35469: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882088.35479: variable 'omit' from source: magic vars 7557 1726882088.35740: variable 'ansible_distribution_major_version' from source: facts 7557 1726882088.35761: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882088.35842: variable 'network_state' from source: role '' defaults 7557 1726882088.35857: Evaluated conditional (network_state != {}): False 7557 1726882088.35862: when evaluation is False, skipping this task 7557 1726882088.35864: _execute() done 7557 1726882088.35867: dumping result to json 7557 1726882088.35869: done dumping result, returning 7557 1726882088.35872: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [12673a56-9f93-ed48-b3a5-000000000019] 7557 1726882088.35875: sending task result for task 12673a56-9f93-ed48-b3a5-000000000019 7557 1726882088.35957: done sending task result for task 12673a56-9f93-ed48-b3a5-000000000019 7557 1726882088.35961: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 7557 1726882088.36015: no more pending results, returning what we have 7557 1726882088.36019: results queue empty 7557 1726882088.36020: checking for any_errors_fatal 7557 1726882088.36027: done checking for any_errors_fatal 7557 1726882088.36027: checking for max_fail_percentage 7557 1726882088.36029: done checking for max_fail_percentage 7557 1726882088.36030: checking to see if all hosts have failed and the running result is not ok 7557 1726882088.36031: done checking to see if all hosts have failed 7557 1726882088.36031: getting the remaining hosts for this loop 7557 1726882088.36033: done getting the remaining hosts for this loop 7557 1726882088.36036: getting the next task for host managed_node3 7557 1726882088.36043: done getting next task for host managed_node3 7557 1726882088.36047: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 7557 1726882088.36050: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7557 1726882088.36065: getting variables 7557 1726882088.36066: in VariableManager get_vars() 7557 1726882088.36119: Calling all_inventory to load vars for managed_node3 7557 1726882088.36122: Calling groups_inventory to load vars for managed_node3 7557 1726882088.36124: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882088.36132: Calling all_plugins_play to load vars for managed_node3 7557 1726882088.36135: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882088.36137: Calling groups_plugins_play to load vars for managed_node3 7557 1726882088.36950: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882088.37810: done with get_vars() 7557 1726882088.37825: done getting variables 7557 1726882088.37869: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 21:28:08 -0400 (0:00:00.030) 0:00:14.231 ****** 7557 1726882088.37896: entering _queue_task() for managed_node3/fail 7557 1726882088.38124: worker is 1 (out of 1 available) 7557 1726882088.38137: exiting _queue_task() for managed_node3/fail 7557 1726882088.38150: done queuing things up, now waiting for results queue to drain 7557 1726882088.38151: waiting for pending results... 
7557 1726882088.38324: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 7557 1726882088.38418: in run() - task 12673a56-9f93-ed48-b3a5-00000000001a 7557 1726882088.38431: variable 'ansible_search_path' from source: unknown 7557 1726882088.38435: variable 'ansible_search_path' from source: unknown 7557 1726882088.38463: calling self._execute() 7557 1726882088.38536: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882088.38541: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882088.38549: variable 'omit' from source: magic vars 7557 1726882088.38813: variable 'ansible_distribution_major_version' from source: facts 7557 1726882088.38825: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882088.38948: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 7557 1726882088.40440: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 7557 1726882088.40496: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 7557 1726882088.40523: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 7557 1726882088.40548: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 7557 1726882088.40570: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 7557 1726882088.40631: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7557 1726882088.40650: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7557 1726882088.40671: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7557 1726882088.40701: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7557 1726882088.40712: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7557 1726882088.40781: variable 'ansible_distribution_major_version' from source: facts 7557 1726882088.40797: Evaluated conditional (ansible_distribution_major_version | int > 9): True 7557 1726882088.40872: variable 'ansible_distribution' from source: facts 7557 1726882088.40878: variable '__network_rh_distros' from source: role '' defaults 7557 1726882088.40888: Evaluated conditional (ansible_distribution in __network_rh_distros): True 7557 1726882088.41043: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7557 1726882088.41059: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7557 1726882088.41076: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7557 1726882088.41111: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7557 1726882088.41120: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7557 1726882088.41151: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7557 1726882088.41167: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7557 1726882088.41183: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7557 1726882088.41214: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7557 1726882088.41226: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7557 1726882088.41255: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7557 1726882088.41271: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7557 1726882088.41287: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7557 1726882088.41316: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7557 1726882088.41331: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7557 1726882088.41517: variable 'network_connections' from source: task vars 7557 1726882088.41526: variable 'interface' from source: play vars 7557 1726882088.41579: variable 'interface' from source: play vars 7557 1726882088.41590: variable 'network_state' from source: role '' defaults 7557 1726882088.41637: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 7557 1726882088.41753: Loading TestModule 
'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 7557 1726882088.41781: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 7557 1726882088.41806: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 7557 1726882088.41828: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 7557 1726882088.41858: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 7557 1726882088.41876: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 7557 1726882088.41900: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 7557 1726882088.41919: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 7557 1726882088.41947: Evaluated conditional (network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0 or network_state.get("interfaces", []) | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0): False 7557 1726882088.41951: when evaluation is False, skipping this task 7557 1726882088.41953: _execute() done 7557 1726882088.41956: dumping result to json 7557 1726882088.41958: done dumping result, returning 7557 1726882088.41965: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [12673a56-9f93-ed48-b3a5-00000000001a] 7557 1726882088.41970: sending task result for task 12673a56-9f93-ed48-b3a5-00000000001a 7557 1726882088.42057: done sending task result for task 12673a56-9f93-ed48-b3a5-00000000001a 7557 1726882088.42060: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_connections | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0 or network_state.get(\"interfaces\", []) | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0", "skip_reason": "Conditional result was False" } 7557 1726882088.42127: no more pending results, returning what we have 7557 1726882088.42131: results queue empty 7557 1726882088.42131: checking for any_errors_fatal 7557 1726882088.42137: done checking for any_errors_fatal 7557 1726882088.42138: checking for max_fail_percentage 7557 1726882088.42140: done checking for max_fail_percentage 7557 1726882088.42140: checking to see if all hosts have failed and the running result is not ok 7557 1726882088.42141: done checking to see if all hosts have failed 7557 1726882088.42142: getting the remaining hosts for this loop 7557 1726882088.42143: done getting the remaining hosts for this loop 7557 1726882088.42146: getting the next task for host managed_node3 7557 1726882088.42153: done getting next task for host managed_node3 7557 1726882088.42157: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates 
for network packages are available through the DNF package manager due to wireless or team interfaces 7557 1726882088.42160: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7557 1726882088.42173: getting variables 7557 1726882088.42175: in VariableManager get_vars() 7557 1726882088.42225: Calling all_inventory to load vars for managed_node3 7557 1726882088.42227: Calling groups_inventory to load vars for managed_node3 7557 1726882088.42230: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882088.42238: Calling all_plugins_play to load vars for managed_node3 7557 1726882088.42240: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882088.42243: Calling groups_plugins_play to load vars for managed_node3 7557 1726882088.43021: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882088.43970: done with get_vars() 7557 1726882088.43986: done getting variables 7557 1726882088.44065: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 21:28:08 -0400 (0:00:00.061) 0:00:14.293 ****** 7557 1726882088.44087: entering _queue_task() for managed_node3/dnf 7557 1726882088.44327: worker is 1 (out of 1 available) 7557 1726882088.44341: exiting _queue_task() for managed_node3/dnf 7557 1726882088.44354: done queuing things up, now waiting for results queue to drain 7557 1726882088.44355: waiting for pending results... 
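The teaming abort above is skipped because neither network_connections nor network_state defines an interface of type "team". The plain-Python sketch below illustrates what the selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0 filter chain computes; the connection and state values are hypothetical examples, and Ansible's match test is approximated with re.match.

# Illustration only: a plain-Python rendering of the team-detection
# conditional evaluated above. Input data is hypothetical.
import re

def has_team_interface(network_connections: list, network_state: dict) -> bool:
    # selectattr("type", "defined") keeps items that have a "type" key;
    # selectattr("type", "match", "^team$") keeps items whose type matches the regex.
    team_conns = [c for c in network_connections
                  if "type" in c and re.match(r"^team$", str(c["type"]))]
    team_ifaces = [i for i in network_state.get("interfaces", [])
                   if "type" in i and re.match(r"^team$", str(i["type"]))]
    return len(team_conns) > 0 or len(team_ifaces) > 0

# A single non-team connection, as in this run, yields False -> the abort task is skipped.
print(has_team_interface([{"name": "veth0", "type": "ethernet"}], {}))  # False
print(has_team_interface([{"name": "team0", "type": "team"}], {}))      # True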
7557 1726882088.44535: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 7557 1726882088.44637: in run() - task 12673a56-9f93-ed48-b3a5-00000000001b 7557 1726882088.44648: variable 'ansible_search_path' from source: unknown 7557 1726882088.44652: variable 'ansible_search_path' from source: unknown 7557 1726882088.44681: calling self._execute() 7557 1726882088.44756: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882088.44759: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882088.44768: variable 'omit' from source: magic vars 7557 1726882088.45038: variable 'ansible_distribution_major_version' from source: facts 7557 1726882088.45048: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882088.45183: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 7557 1726882088.46660: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 7557 1726882088.46715: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 7557 1726882088.46742: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 7557 1726882088.46769: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 7557 1726882088.46790: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 7557 1726882088.46850: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7557 1726882088.46870: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7557 1726882088.46896: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7557 1726882088.46924: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7557 1726882088.46934: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7557 1726882088.47018: variable 'ansible_distribution' from source: facts 7557 1726882088.47021: variable 'ansible_distribution_major_version' from source: facts 7557 1726882088.47033: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 7557 1726882088.47110: variable '__network_wireless_connections_defined' from source: role '' defaults 7557 1726882088.47208: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7557 1726882088.47215: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7557 1726882088.47233: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7557 1726882088.47257: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7557 1726882088.47268: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7557 1726882088.47298: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7557 1726882088.47316: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7557 1726882088.47334: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7557 1726882088.47358: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7557 1726882088.47369: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7557 1726882088.47399: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7557 1726882088.47414: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7557 1726882088.47434: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7557 1726882088.47458: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7557 1726882088.47468: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7557 1726882088.47570: variable 'network_connections' from source: task vars 7557 1726882088.47580: variable 'interface' from source: play vars 7557 1726882088.47630: variable 'interface' from source: play vars 7557 1726882088.47681: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 7557 1726882088.47791: Loading TestModule 'core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 7557 1726882088.47820: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 7557 1726882088.47842: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 7557 1726882088.47864: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 7557 1726882088.47901: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 7557 1726882088.47917: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 7557 1726882088.47938: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 7557 1726882088.47956: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 7557 1726882088.48006: variable '__network_team_connections_defined' from source: role '' defaults 7557 1726882088.48167: variable 'network_connections' from source: task vars 7557 1726882088.48170: variable 'interface' from source: play vars 7557 1726882088.48220: variable 'interface' from source: play vars 7557 1726882088.48245: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 7557 1726882088.48249: when evaluation is False, skipping this task 7557 1726882088.48252: _execute() done 7557 1726882088.48254: dumping result to json 7557 1726882088.48256: done dumping result, returning 7557 1726882088.48263: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [12673a56-9f93-ed48-b3a5-00000000001b] 7557 1726882088.48268: sending task result for task 12673a56-9f93-ed48-b3a5-00000000001b 7557 1726882088.48356: done sending task result for task 12673a56-9f93-ed48-b3a5-00000000001b 7557 1726882088.48358: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 7557 1726882088.48411: no more pending results, returning what we have 7557 1726882088.48415: results queue empty 7557 1726882088.48415: checking for any_errors_fatal 7557 1726882088.48421: done checking for any_errors_fatal 7557 1726882088.48422: checking for max_fail_percentage 7557 1726882088.48424: done checking for max_fail_percentage 7557 1726882088.48425: checking to see if all hosts have failed and the running result is not ok 7557 1726882088.48425: done checking to see if all hosts have failed 7557 1726882088.48426: getting the remaining hosts for this loop 7557 1726882088.48428: done getting the remaining hosts for this loop 7557 1726882088.48431: getting the next task for host managed_node3 7557 1726882088.48438: done getting next task for host managed_node3 7557 1726882088.48442: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are 
available through the YUM package manager due to wireless or team interfaces 7557 1726882088.48444: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7557 1726882088.48458: getting variables 7557 1726882088.48459: in VariableManager get_vars() 7557 1726882088.48511: Calling all_inventory to load vars for managed_node3 7557 1726882088.48514: Calling groups_inventory to load vars for managed_node3 7557 1726882088.48516: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882088.48525: Calling all_plugins_play to load vars for managed_node3 7557 1726882088.48528: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882088.48530: Calling groups_plugins_play to load vars for managed_node3 7557 1726882088.49307: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882088.50155: done with get_vars() 7557 1726882088.50170: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 7557 1726882088.50226: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 21:28:08 -0400 (0:00:00.061) 0:00:14.355 ****** 7557 1726882088.50250: entering _queue_task() for managed_node3/yum 7557 1726882088.50251: Creating lock for yum 7557 1726882088.50474: worker is 1 (out of 1 available) 7557 1726882088.50487: exiting _queue_task() for managed_node3/yum 7557 1726882088.50503: done queuing things up, now waiting for results queue to drain 7557 1726882088.50505: waiting for pending results... 
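The DNF "check if updates ... are available" task above is gated on the distribution being Fedora or the major version being above 7 (and on wireless or team interfaces being defined, which is why it was skipped here), while the YUM variant queued next is gated on a major version below 8. A small sketch of those version gates, with hypothetical distribution and version inputs:

# Minimal sketch of the version gates used by the two package-manager check
# tasks; distribution names and versions below are hypothetical inputs.
def dnf_check_applies(distribution: str, major_version: str) -> bool:
    # Mirrors: ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7
    return distribution == "Fedora" or int(major_version) > 7

def yum_check_applies(major_version: str) -> bool:
    # Mirrors: ansible_distribution_major_version | int < 8
    return int(major_version) < 8

print(dnf_check_applies("CentOS", "10"), yum_check_applies("10"))  # True False -> YUM task skipped
print(dnf_check_applies("CentOS", "7"), yum_check_applies("7"))    # False True -> YUM path would apply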
7557 1726882088.50674: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 7557 1726882088.50767: in run() - task 12673a56-9f93-ed48-b3a5-00000000001c 7557 1726882088.50778: variable 'ansible_search_path' from source: unknown 7557 1726882088.50782: variable 'ansible_search_path' from source: unknown 7557 1726882088.50814: calling self._execute() 7557 1726882088.50883: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882088.50888: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882088.50901: variable 'omit' from source: magic vars 7557 1726882088.51164: variable 'ansible_distribution_major_version' from source: facts 7557 1726882088.51175: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882088.51296: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 7557 1726882088.52781: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 7557 1726882088.52838: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 7557 1726882088.52865: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 7557 1726882088.52890: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 7557 1726882088.52918: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 7557 1726882088.52972: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7557 1726882088.52992: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7557 1726882088.53016: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7557 1726882088.53044: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7557 1726882088.53055: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7557 1726882088.53125: variable 'ansible_distribution_major_version' from source: facts 7557 1726882088.53139: Evaluated conditional (ansible_distribution_major_version | int < 8): False 7557 1726882088.53142: when evaluation is False, skipping this task 7557 1726882088.53145: _execute() done 7557 1726882088.53148: dumping result to json 7557 1726882088.53150: done dumping result, returning 7557 1726882088.53158: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [12673a56-9f93-ed48-b3a5-00000000001c] 7557 1726882088.53162: sending task result for 
task 12673a56-9f93-ed48-b3a5-00000000001c 7557 1726882088.53250: done sending task result for task 12673a56-9f93-ed48-b3a5-00000000001c 7557 1726882088.53252: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 7557 1726882088.53299: no more pending results, returning what we have 7557 1726882088.53303: results queue empty 7557 1726882088.53304: checking for any_errors_fatal 7557 1726882088.53309: done checking for any_errors_fatal 7557 1726882088.53310: checking for max_fail_percentage 7557 1726882088.53311: done checking for max_fail_percentage 7557 1726882088.53312: checking to see if all hosts have failed and the running result is not ok 7557 1726882088.53313: done checking to see if all hosts have failed 7557 1726882088.53313: getting the remaining hosts for this loop 7557 1726882088.53315: done getting the remaining hosts for this loop 7557 1726882088.53318: getting the next task for host managed_node3 7557 1726882088.53325: done getting next task for host managed_node3 7557 1726882088.53329: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 7557 1726882088.53332: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7557 1726882088.53345: getting variables 7557 1726882088.53346: in VariableManager get_vars() 7557 1726882088.53400: Calling all_inventory to load vars for managed_node3 7557 1726882088.53403: Calling groups_inventory to load vars for managed_node3 7557 1726882088.53405: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882088.53414: Calling all_plugins_play to load vars for managed_node3 7557 1726882088.53416: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882088.53418: Calling groups_plugins_play to load vars for managed_node3 7557 1726882088.56868: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882088.57708: done with get_vars() 7557 1726882088.57724: done getting variables 7557 1726882088.57757: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 21:28:08 -0400 (0:00:00.075) 0:00:14.430 ****** 7557 1726882088.57779: entering _queue_task() for managed_node3/fail 7557 1726882088.58020: worker is 1 (out of 1 available) 7557 1726882088.58033: exiting _queue_task() for managed_node3/fail 7557 1726882088.58046: done queuing things up, now waiting for results queue to drain 7557 1726882088.58048: waiting for pending results... 7557 1726882088.58224: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 7557 1726882088.58325: in run() - task 12673a56-9f93-ed48-b3a5-00000000001d 7557 1726882088.58337: variable 'ansible_search_path' from source: unknown 7557 1726882088.58340: variable 'ansible_search_path' from source: unknown 7557 1726882088.58370: calling self._execute() 7557 1726882088.58448: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882088.58453: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882088.58462: variable 'omit' from source: magic vars 7557 1726882088.58762: variable 'ansible_distribution_major_version' from source: facts 7557 1726882088.58772: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882088.58860: variable '__network_wireless_connections_defined' from source: role '' defaults 7557 1726882088.58987: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 7557 1726882088.60447: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 7557 1726882088.60499: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 7557 1726882088.60529: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 7557 1726882088.60554: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 7557 1726882088.60576: Loading FilterModule 'urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 7557 1726882088.60637: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7557 1726882088.60657: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7557 1726882088.60678: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7557 1726882088.60709: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7557 1726882088.60720: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7557 1726882088.60752: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7557 1726882088.60767: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7557 1726882088.60788: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7557 1726882088.60817: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7557 1726882088.60829: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7557 1726882088.60857: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7557 1726882088.60872: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7557 1726882088.60895: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7557 1726882088.60920: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7557 1726882088.60931: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7557 1726882088.61043: variable 'network_connections' from source: task vars 7557 1726882088.61053: variable 
'interface' from source: play vars 7557 1726882088.61113: variable 'interface' from source: play vars 7557 1726882088.61161: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 7557 1726882088.61282: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 7557 1726882088.61316: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 7557 1726882088.61341: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 7557 1726882088.61363: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 7557 1726882088.61396: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 7557 1726882088.61413: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 7557 1726882088.61435: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 7557 1726882088.61451: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 7557 1726882088.61496: variable '__network_team_connections_defined' from source: role '' defaults 7557 1726882088.61656: variable 'network_connections' from source: task vars 7557 1726882088.61659: variable 'interface' from source: play vars 7557 1726882088.61706: variable 'interface' from source: play vars 7557 1726882088.61731: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 7557 1726882088.61734: when evaluation is False, skipping this task 7557 1726882088.61737: _execute() done 7557 1726882088.61739: dumping result to json 7557 1726882088.61741: done dumping result, returning 7557 1726882088.61748: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [12673a56-9f93-ed48-b3a5-00000000001d] 7557 1726882088.61753: sending task result for task 12673a56-9f93-ed48-b3a5-00000000001d 7557 1726882088.61841: done sending task result for task 12673a56-9f93-ed48-b3a5-00000000001d 7557 1726882088.61844: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 7557 1726882088.61920: no more pending results, returning what we have 7557 1726882088.61924: results queue empty 7557 1726882088.61925: checking for any_errors_fatal 7557 1726882088.61931: done checking for any_errors_fatal 7557 1726882088.61931: checking for max_fail_percentage 7557 1726882088.61934: done checking for max_fail_percentage 7557 1726882088.61934: checking to see if all hosts have failed and the running result is not ok 7557 1726882088.61935: done checking to see if all hosts have failed 7557 1726882088.61936: getting the remaining hosts for this loop 7557 1726882088.61937: done getting the remaining hosts for this loop 7557 1726882088.61941: 
getting the next task for host managed_node3 7557 1726882088.61947: done getting next task for host managed_node3 7557 1726882088.61951: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 7557 1726882088.61953: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7557 1726882088.61967: getting variables 7557 1726882088.61968: in VariableManager get_vars() 7557 1726882088.62013: Calling all_inventory to load vars for managed_node3 7557 1726882088.62016: Calling groups_inventory to load vars for managed_node3 7557 1726882088.62018: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882088.62028: Calling all_plugins_play to load vars for managed_node3 7557 1726882088.62030: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882088.62033: Calling groups_plugins_play to load vars for managed_node3 7557 1726882088.62821: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882088.63713: done with get_vars() 7557 1726882088.63732: done getting variables 7557 1726882088.63775: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 21:28:08 -0400 (0:00:00.060) 0:00:14.490 ****** 7557 1726882088.63802: entering _queue_task() for managed_node3/package 7557 1726882088.64043: worker is 1 (out of 1 available) 7557 1726882088.64057: exiting _queue_task() for managed_node3/package 7557 1726882088.64070: done queuing things up, now waiting for results queue to drain 7557 1726882088.64072: waiting for pending results... 
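The "Install packages" task queued here is skipped a few entries further down because every package listed in network_packages already appears in the gathered package facts, so not network_packages is subset(ansible_facts.packages.keys()) evaluates to False. Below is a rough plain-Python equivalent of Ansible's subset test; the package data is hypothetical.

# Rough plain-Python equivalent of the conditional
# "not network_packages is subset(ansible_facts.packages.keys())".
# Package data below is made up; ansible_facts.packages is normally produced
# by the package_facts module and maps package name -> list of version dicts.
def packages_missing(network_packages: list, installed_packages: dict) -> bool:
    return not set(network_packages).issubset(installed_packages.keys())

installed = {"NetworkManager": [{"version": "1.0"}], "python3": [{"version": "3.12"}]}
print(packages_missing(["NetworkManager"], installed))                    # False -> install task skipped
print(packages_missing(["NetworkManager", "wpa_supplicant"], installed))  # True  -> install would run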
7557 1726882088.64243: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install packages 7557 1726882088.64332: in run() - task 12673a56-9f93-ed48-b3a5-00000000001e 7557 1726882088.64343: variable 'ansible_search_path' from source: unknown 7557 1726882088.64346: variable 'ansible_search_path' from source: unknown 7557 1726882088.64376: calling self._execute() 7557 1726882088.64454: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882088.64458: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882088.64468: variable 'omit' from source: magic vars 7557 1726882088.64749: variable 'ansible_distribution_major_version' from source: facts 7557 1726882088.64759: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882088.64892: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 7557 1726882088.65081: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 7557 1726882088.65115: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 7557 1726882088.65174: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 7557 1726882088.65211: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 7557 1726882088.65291: variable 'network_packages' from source: role '' defaults 7557 1726882088.65365: variable '__network_provider_setup' from source: role '' defaults 7557 1726882088.65373: variable '__network_service_name_default_nm' from source: role '' defaults 7557 1726882088.65428: variable '__network_service_name_default_nm' from source: role '' defaults 7557 1726882088.65438: variable '__network_packages_default_nm' from source: role '' defaults 7557 1726882088.65478: variable '__network_packages_default_nm' from source: role '' defaults 7557 1726882088.65592: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 7557 1726882088.67173: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 7557 1726882088.67216: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 7557 1726882088.67245: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 7557 1726882088.67268: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 7557 1726882088.67290: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 7557 1726882088.67355: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7557 1726882088.67373: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7557 1726882088.67390: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7557 1726882088.67422: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7557 1726882088.67432: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7557 1726882088.67466: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7557 1726882088.67482: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7557 1726882088.67503: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7557 1726882088.67528: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7557 1726882088.67538: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7557 1726882088.67681: variable '__network_packages_default_gobject_packages' from source: role '' defaults 7557 1726882088.67755: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7557 1726882088.67787: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7557 1726882088.67807: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7557 1726882088.67831: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7557 1726882088.67842: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7557 1726882088.67905: variable 'ansible_python' from source: facts 7557 1726882088.67924: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 7557 1726882088.67979: variable '__network_wpa_supplicant_required' from source: role '' defaults 7557 1726882088.68040: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 7557 1726882088.68126: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7557 1726882088.68142: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, 
class_only=False) 7557 1726882088.68159: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7557 1726882088.68183: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7557 1726882088.68194: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7557 1726882088.68231: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7557 1726882088.68250: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7557 1726882088.68267: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7557 1726882088.68290: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7557 1726882088.68305: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7557 1726882088.68401: variable 'network_connections' from source: task vars 7557 1726882088.68406: variable 'interface' from source: play vars 7557 1726882088.68478: variable 'interface' from source: play vars 7557 1726882088.68533: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 7557 1726882088.68556: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 7557 1726882088.68577: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 7557 1726882088.68601: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 7557 1726882088.68636: variable '__network_wireless_connections_defined' from source: role '' defaults 7557 1726882088.68816: variable 'network_connections' from source: task vars 7557 1726882088.68820: variable 'interface' from source: play vars 7557 1726882088.68890: variable 'interface' from source: play vars 7557 1726882088.68933: variable '__network_packages_default_wireless' from source: role '' defaults 7557 1726882088.68986: variable '__network_wireless_connections_defined' from source: role '' defaults 7557 1726882088.69177: variable 'network_connections' from source: task vars 7557 1726882088.69181: variable 'interface' from source: play vars 7557 
1726882088.69233: variable 'interface' from source: play vars 7557 1726882088.69252: variable '__network_packages_default_team' from source: role '' defaults 7557 1726882088.69310: variable '__network_team_connections_defined' from source: role '' defaults 7557 1726882088.69498: variable 'network_connections' from source: task vars 7557 1726882088.69505: variable 'interface' from source: play vars 7557 1726882088.69553: variable 'interface' from source: play vars 7557 1726882088.69598: variable '__network_service_name_default_initscripts' from source: role '' defaults 7557 1726882088.69645: variable '__network_service_name_default_initscripts' from source: role '' defaults 7557 1726882088.69651: variable '__network_packages_default_initscripts' from source: role '' defaults 7557 1726882088.69692: variable '__network_packages_default_initscripts' from source: role '' defaults 7557 1726882088.69830: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 7557 1726882088.70129: variable 'network_connections' from source: task vars 7557 1726882088.70132: variable 'interface' from source: play vars 7557 1726882088.70177: variable 'interface' from source: play vars 7557 1726882088.70186: variable 'ansible_distribution' from source: facts 7557 1726882088.70188: variable '__network_rh_distros' from source: role '' defaults 7557 1726882088.70195: variable 'ansible_distribution_major_version' from source: facts 7557 1726882088.70215: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 7557 1726882088.70324: variable 'ansible_distribution' from source: facts 7557 1726882088.70327: variable '__network_rh_distros' from source: role '' defaults 7557 1726882088.70332: variable 'ansible_distribution_major_version' from source: facts 7557 1726882088.70343: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 7557 1726882088.70451: variable 'ansible_distribution' from source: facts 7557 1726882088.70454: variable '__network_rh_distros' from source: role '' defaults 7557 1726882088.70458: variable 'ansible_distribution_major_version' from source: facts 7557 1726882088.70483: variable 'network_provider' from source: set_fact 7557 1726882088.70502: variable 'ansible_facts' from source: unknown 7557 1726882088.70876: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 7557 1726882088.70879: when evaluation is False, skipping this task 7557 1726882088.70882: _execute() done 7557 1726882088.70884: dumping result to json 7557 1726882088.70886: done dumping result, returning 7557 1726882088.70895: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install packages [12673a56-9f93-ed48-b3a5-00000000001e] 7557 1726882088.70902: sending task result for task 12673a56-9f93-ed48-b3a5-00000000001e 7557 1726882088.70988: done sending task result for task 12673a56-9f93-ed48-b3a5-00000000001e 7557 1726882088.70990: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 7557 1726882088.71040: no more pending results, returning what we have 7557 1726882088.71044: results queue empty 7557 1726882088.71045: checking for any_errors_fatal 7557 1726882088.71050: done checking for any_errors_fatal 7557 1726882088.71051: checking for max_fail_percentage 7557 1726882088.71053: done checking for 
max_fail_percentage 7557 1726882088.71053: checking to see if all hosts have failed and the running result is not ok 7557 1726882088.71054: done checking to see if all hosts have failed 7557 1726882088.71055: getting the remaining hosts for this loop 7557 1726882088.71056: done getting the remaining hosts for this loop 7557 1726882088.71060: getting the next task for host managed_node3 7557 1726882088.71066: done getting next task for host managed_node3 7557 1726882088.71070: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 7557 1726882088.71072: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7557 1726882088.71086: getting variables 7557 1726882088.71087: in VariableManager get_vars() 7557 1726882088.71139: Calling all_inventory to load vars for managed_node3 7557 1726882088.71142: Calling groups_inventory to load vars for managed_node3 7557 1726882088.71144: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882088.71153: Calling all_plugins_play to load vars for managed_node3 7557 1726882088.71156: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882088.71159: Calling groups_plugins_play to load vars for managed_node3 7557 1726882088.72095: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882088.72945: done with get_vars() 7557 1726882088.72961: done getting variables 7557 1726882088.73005: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 21:28:08 -0400 (0:00:00.092) 0:00:14.583 ****** 7557 1726882088.73028: entering _queue_task() for managed_node3/package 7557 1726882088.73261: worker is 1 (out of 1 available) 7557 1726882088.73276: exiting _queue_task() for managed_node3/package 7557 1726882088.73289: done queuing things up, now waiting for results queue to drain 7557 1726882088.73291: waiting for pending results... 
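The "Install packages" skip above is driven entirely by the task's when: clause, which Ansible echoes back in the false_condition field of the skip result. A minimal sketch of that pattern, assuming a plain package task (the condition is copied from the log and the 'package' action plugin load is shown in the trace; the rest of the task body is an illustrative assumption and may differ from the actual roles/network/tasks/main.yml):

    - name: Install packages
      ansible.builtin.package:
        # the role's requested package list; referenced by the condition below
        name: "{{ network_packages }}"
        state: present
      # copied verbatim from the false_condition in the skip result above:
      # skip the install when every requested package already appears in the
      # gathered package facts
      when: not network_packages is subset(ansible_facts.packages.keys())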
7557 1726882088.73465: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 7557 1726882088.73566: in run() - task 12673a56-9f93-ed48-b3a5-00000000001f 7557 1726882088.73577: variable 'ansible_search_path' from source: unknown 7557 1726882088.73581: variable 'ansible_search_path' from source: unknown 7557 1726882088.73614: calling self._execute() 7557 1726882088.73690: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882088.73695: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882088.73707: variable 'omit' from source: magic vars 7557 1726882088.73983: variable 'ansible_distribution_major_version' from source: facts 7557 1726882088.73994: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882088.74081: variable 'network_state' from source: role '' defaults 7557 1726882088.74089: Evaluated conditional (network_state != {}): False 7557 1726882088.74092: when evaluation is False, skipping this task 7557 1726882088.74096: _execute() done 7557 1726882088.74101: dumping result to json 7557 1726882088.74104: done dumping result, returning 7557 1726882088.74112: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [12673a56-9f93-ed48-b3a5-00000000001f] 7557 1726882088.74115: sending task result for task 12673a56-9f93-ed48-b3a5-00000000001f 7557 1726882088.74203: done sending task result for task 12673a56-9f93-ed48-b3a5-00000000001f 7557 1726882088.74205: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 7557 1726882088.74247: no more pending results, returning what we have 7557 1726882088.74251: results queue empty 7557 1726882088.74251: checking for any_errors_fatal 7557 1726882088.74257: done checking for any_errors_fatal 7557 1726882088.74257: checking for max_fail_percentage 7557 1726882088.74259: done checking for max_fail_percentage 7557 1726882088.74259: checking to see if all hosts have failed and the running result is not ok 7557 1726882088.74260: done checking to see if all hosts have failed 7557 1726882088.74261: getting the remaining hosts for this loop 7557 1726882088.74262: done getting the remaining hosts for this loop 7557 1726882088.74265: getting the next task for host managed_node3 7557 1726882088.74272: done getting next task for host managed_node3 7557 1726882088.74276: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 7557 1726882088.74279: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7557 1726882088.74297: getting variables 7557 1726882088.74299: in VariableManager get_vars() 7557 1726882088.74343: Calling all_inventory to load vars for managed_node3 7557 1726882088.74346: Calling groups_inventory to load vars for managed_node3 7557 1726882088.74348: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882088.74356: Calling all_plugins_play to load vars for managed_node3 7557 1726882088.74358: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882088.74361: Calling groups_plugins_play to load vars for managed_node3 7557 1726882088.75091: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882088.75939: done with get_vars() 7557 1726882088.75954: done getting variables 7557 1726882088.75995: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 21:28:08 -0400 (0:00:00.029) 0:00:14.613 ****** 7557 1726882088.76018: entering _queue_task() for managed_node3/package 7557 1726882088.76229: worker is 1 (out of 1 available) 7557 1726882088.76243: exiting _queue_task() for managed_node3/package 7557 1726882088.76256: done queuing things up, now waiting for results queue to drain 7557 1726882088.76257: waiting for pending results... 7557 1726882088.76431: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 7557 1726882088.76518: in run() - task 12673a56-9f93-ed48-b3a5-000000000020 7557 1726882088.76529: variable 'ansible_search_path' from source: unknown 7557 1726882088.76532: variable 'ansible_search_path' from source: unknown 7557 1726882088.76561: calling self._execute() 7557 1726882088.76637: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882088.76641: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882088.76650: variable 'omit' from source: magic vars 7557 1726882088.76926: variable 'ansible_distribution_major_version' from source: facts 7557 1726882088.76934: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882088.77015: variable 'network_state' from source: role '' defaults 7557 1726882088.77029: Evaluated conditional (network_state != {}): False 7557 1726882088.77032: when evaluation is False, skipping this task 7557 1726882088.77035: _execute() done 7557 1726882088.77038: dumping result to json 7557 1726882088.77040: done dumping result, returning 7557 1726882088.77043: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [12673a56-9f93-ed48-b3a5-000000000020] 7557 1726882088.77046: sending task result for task 12673a56-9f93-ed48-b3a5-000000000020 7557 1726882088.77132: done sending task result for task 12673a56-9f93-ed48-b3a5-000000000020 7557 1726882088.77135: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": 
"Conditional result was False" } 7557 1726882088.77179: no more pending results, returning what we have 7557 1726882088.77184: results queue empty 7557 1726882088.77184: checking for any_errors_fatal 7557 1726882088.77192: done checking for any_errors_fatal 7557 1726882088.77194: checking for max_fail_percentage 7557 1726882088.77196: done checking for max_fail_percentage 7557 1726882088.77197: checking to see if all hosts have failed and the running result is not ok 7557 1726882088.77197: done checking to see if all hosts have failed 7557 1726882088.77198: getting the remaining hosts for this loop 7557 1726882088.77200: done getting the remaining hosts for this loop 7557 1726882088.77203: getting the next task for host managed_node3 7557 1726882088.77208: done getting next task for host managed_node3 7557 1726882088.77212: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 7557 1726882088.77215: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7557 1726882088.77229: getting variables 7557 1726882088.77230: in VariableManager get_vars() 7557 1726882088.77269: Calling all_inventory to load vars for managed_node3 7557 1726882088.77272: Calling groups_inventory to load vars for managed_node3 7557 1726882088.77274: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882088.77282: Calling all_plugins_play to load vars for managed_node3 7557 1726882088.77284: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882088.77286: Calling groups_plugins_play to load vars for managed_node3 7557 1726882088.78106: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882088.78947: done with get_vars() 7557 1726882088.78962: done getting variables 7557 1726882088.79034: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 21:28:08 -0400 (0:00:00.030) 0:00:14.643 ****** 7557 1726882088.79056: entering _queue_task() for managed_node3/service 7557 1726882088.79057: Creating lock for service 7557 1726882088.79265: worker is 1 (out of 1 available) 7557 1726882088.79278: exiting _queue_task() for managed_node3/service 7557 1726882088.79292: done queuing things up, now waiting for results queue to drain 7557 1726882088.79295: waiting for pending results... 
7557 1726882088.79462: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 7557 1726882088.79558: in run() - task 12673a56-9f93-ed48-b3a5-000000000021 7557 1726882088.79570: variable 'ansible_search_path' from source: unknown 7557 1726882088.79574: variable 'ansible_search_path' from source: unknown 7557 1726882088.79605: calling self._execute() 7557 1726882088.79675: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882088.79679: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882088.79687: variable 'omit' from source: magic vars 7557 1726882088.79957: variable 'ansible_distribution_major_version' from source: facts 7557 1726882088.79974: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882088.80055: variable '__network_wireless_connections_defined' from source: role '' defaults 7557 1726882088.80187: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 7557 1726882088.81647: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 7557 1726882088.81704: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 7557 1726882088.81730: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 7557 1726882088.81755: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 7557 1726882088.81775: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 7557 1726882088.81837: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7557 1726882088.81857: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7557 1726882088.81874: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7557 1726882088.81904: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7557 1726882088.81920: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7557 1726882088.81953: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7557 1726882088.81969: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7557 1726882088.81985: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, 
class_only=False) 7557 1726882088.82015: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7557 1726882088.82028: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7557 1726882088.82059: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7557 1726882088.82074: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7557 1726882088.82090: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7557 1726882088.82119: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7557 1726882088.82130: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7557 1726882088.82251: variable 'network_connections' from source: task vars 7557 1726882088.82260: variable 'interface' from source: play vars 7557 1726882088.82318: variable 'interface' from source: play vars 7557 1726882088.82372: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 7557 1726882088.82483: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 7557 1726882088.82521: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 7557 1726882088.82542: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 7557 1726882088.82565: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 7557 1726882088.82600: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 7557 1726882088.82616: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 7557 1726882088.82633: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 7557 1726882088.82650: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 7557 1726882088.82698: variable '__network_team_connections_defined' from source: role '' defaults 7557 1726882088.82851: variable 'network_connections' from source: task vars 7557 1726882088.82855: variable 'interface' from source: play vars 7557 1726882088.82904: variable 
'interface' from source: play vars 7557 1726882088.82928: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 7557 1726882088.82932: when evaluation is False, skipping this task 7557 1726882088.82935: _execute() done 7557 1726882088.82938: dumping result to json 7557 1726882088.82940: done dumping result, returning 7557 1726882088.82946: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [12673a56-9f93-ed48-b3a5-000000000021] 7557 1726882088.82951: sending task result for task 12673a56-9f93-ed48-b3a5-000000000021 7557 1726882088.83037: done sending task result for task 12673a56-9f93-ed48-b3a5-000000000021 7557 1726882088.83046: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 7557 1726882088.83086: no more pending results, returning what we have 7557 1726882088.83090: results queue empty 7557 1726882088.83091: checking for any_errors_fatal 7557 1726882088.83099: done checking for any_errors_fatal 7557 1726882088.83100: checking for max_fail_percentage 7557 1726882088.83102: done checking for max_fail_percentage 7557 1726882088.83103: checking to see if all hosts have failed and the running result is not ok 7557 1726882088.83103: done checking to see if all hosts have failed 7557 1726882088.83104: getting the remaining hosts for this loop 7557 1726882088.83106: done getting the remaining hosts for this loop 7557 1726882088.83109: getting the next task for host managed_node3 7557 1726882088.83117: done getting next task for host managed_node3 7557 1726882088.83120: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 7557 1726882088.83123: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7557 1726882088.83137: getting variables 7557 1726882088.83139: in VariableManager get_vars() 7557 1726882088.83186: Calling all_inventory to load vars for managed_node3 7557 1726882088.83189: Calling groups_inventory to load vars for managed_node3 7557 1726882088.83191: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882088.83203: Calling all_plugins_play to load vars for managed_node3 7557 1726882088.83205: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882088.83208: Calling groups_plugins_play to load vars for managed_node3 7557 1726882088.84001: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882088.84947: done with get_vars() 7557 1726882088.84963: done getting variables 7557 1726882088.85008: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 21:28:08 -0400 (0:00:00.059) 0:00:14.703 ****** 7557 1726882088.85033: entering _queue_task() for managed_node3/service 7557 1726882088.85259: worker is 1 (out of 1 available) 7557 1726882088.85273: exiting _queue_task() for managed_node3/service 7557 1726882088.85287: done queuing things up, now waiting for results queue to drain 7557 1726882088.85288: waiting for pending results... 7557 1726882088.85464: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 7557 1726882088.85570: in run() - task 12673a56-9f93-ed48-b3a5-000000000022 7557 1726882088.85582: variable 'ansible_search_path' from source: unknown 7557 1726882088.85586: variable 'ansible_search_path' from source: unknown 7557 1726882088.85621: calling self._execute() 7557 1726882088.85692: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882088.85700: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882088.85707: variable 'omit' from source: magic vars 7557 1726882088.85981: variable 'ansible_distribution_major_version' from source: facts 7557 1726882088.85990: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882088.86104: variable 'network_provider' from source: set_fact 7557 1726882088.86107: variable 'network_state' from source: role '' defaults 7557 1726882088.86117: Evaluated conditional (network_provider == "nm" or network_state != {}): True 7557 1726882088.86123: variable 'omit' from source: magic vars 7557 1726882088.86157: variable 'omit' from source: magic vars 7557 1726882088.86180: variable 'network_service_name' from source: role '' defaults 7557 1726882088.86233: variable 'network_service_name' from source: role '' defaults 7557 1726882088.86304: variable '__network_provider_setup' from source: role '' defaults 7557 1726882088.86308: variable '__network_service_name_default_nm' from source: role '' defaults 7557 1726882088.86351: variable '__network_service_name_default_nm' from source: role '' defaults 7557 1726882088.86359: variable '__network_packages_default_nm' from source: role '' defaults 7557 1726882088.86406: variable 
'__network_packages_default_nm' from source: role '' defaults 7557 1726882088.86558: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 7557 1726882088.87981: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 7557 1726882088.88037: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 7557 1726882088.88065: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 7557 1726882088.88090: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 7557 1726882088.88112: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 7557 1726882088.88171: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7557 1726882088.88191: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7557 1726882088.88212: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7557 1726882088.88240: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7557 1726882088.88252: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7557 1726882088.88284: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7557 1726882088.88304: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7557 1726882088.88321: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7557 1726882088.88345: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7557 1726882088.88359: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7557 1726882088.88507: variable '__network_packages_default_gobject_packages' from source: role '' defaults 7557 1726882088.88583: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7557 1726882088.88604: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7557 1726882088.88621: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7557 1726882088.88645: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7557 1726882088.88655: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7557 1726882088.88718: variable 'ansible_python' from source: facts 7557 1726882088.88735: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 7557 1726882088.88799: variable '__network_wpa_supplicant_required' from source: role '' defaults 7557 1726882088.88847: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 7557 1726882088.88930: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7557 1726882088.88948: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7557 1726882088.88964: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7557 1726882088.88988: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7557 1726882088.89002: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7557 1726882088.89038: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7557 1726882088.89057: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7557 1726882088.89073: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7557 1726882088.89099: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7557 1726882088.89110: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7557 1726882088.89200: variable 'network_connections' from source: task vars 7557 1726882088.89203: variable 'interface' from source: play vars 7557 
1726882088.89259: variable 'interface' from source: play vars 7557 1726882088.89334: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 7557 1726882088.89465: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 7557 1726882088.89502: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 7557 1726882088.89534: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 7557 1726882088.89563: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 7557 1726882088.89608: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 7557 1726882088.89630: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 7557 1726882088.89651: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 7557 1726882088.89677: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 7557 1726882088.89713: variable '__network_wireless_connections_defined' from source: role '' defaults 7557 1726882088.89883: variable 'network_connections' from source: task vars 7557 1726882088.89886: variable 'interface' from source: play vars 7557 1726882088.89941: variable 'interface' from source: play vars 7557 1726882088.89976: variable '__network_packages_default_wireless' from source: role '' defaults 7557 1726882088.90035: variable '__network_wireless_connections_defined' from source: role '' defaults 7557 1726882088.90215: variable 'network_connections' from source: task vars 7557 1726882088.90220: variable 'interface' from source: play vars 7557 1726882088.90268: variable 'interface' from source: play vars 7557 1726882088.90287: variable '__network_packages_default_team' from source: role '' defaults 7557 1726882088.90343: variable '__network_team_connections_defined' from source: role '' defaults 7557 1726882088.90524: variable 'network_connections' from source: task vars 7557 1726882088.90527: variable 'interface' from source: play vars 7557 1726882088.90577: variable 'interface' from source: play vars 7557 1726882088.90623: variable '__network_service_name_default_initscripts' from source: role '' defaults 7557 1726882088.90665: variable '__network_service_name_default_initscripts' from source: role '' defaults 7557 1726882088.90671: variable '__network_packages_default_initscripts' from source: role '' defaults 7557 1726882088.90714: variable '__network_packages_default_initscripts' from source: role '' defaults 7557 1726882088.90846: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 7557 1726882088.91154: variable 'network_connections' from source: task vars 7557 1726882088.91157: variable 'interface' from source: play vars 7557 1726882088.91201: variable 'interface' from source: play vars 7557 1726882088.91213: variable 'ansible_distribution' from source: facts 7557 1726882088.91216: variable '__network_rh_distros' from 
source: role '' defaults 7557 1726882088.91219: variable 'ansible_distribution_major_version' from source: facts 7557 1726882088.91238: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 7557 1726882088.91350: variable 'ansible_distribution' from source: facts 7557 1726882088.91354: variable '__network_rh_distros' from source: role '' defaults 7557 1726882088.91358: variable 'ansible_distribution_major_version' from source: facts 7557 1726882088.91369: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 7557 1726882088.91480: variable 'ansible_distribution' from source: facts 7557 1726882088.91483: variable '__network_rh_distros' from source: role '' defaults 7557 1726882088.91487: variable 'ansible_distribution_major_version' from source: facts 7557 1726882088.91515: variable 'network_provider' from source: set_fact 7557 1726882088.91538: variable 'omit' from source: magic vars 7557 1726882088.91557: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7557 1726882088.91578: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7557 1726882088.91592: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7557 1726882088.91607: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7557 1726882088.91616: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7557 1726882088.91641: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7557 1726882088.91644: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882088.91646: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882088.91719: Set connection var ansible_module_compression to ZIP_DEFLATED 7557 1726882088.91725: Set connection var ansible_shell_executable to /bin/sh 7557 1726882088.91728: Set connection var ansible_shell_type to sh 7557 1726882088.91733: Set connection var ansible_pipelining to False 7557 1726882088.91735: Set connection var ansible_connection to ssh 7557 1726882088.91740: Set connection var ansible_timeout to 10 7557 1726882088.91764: variable 'ansible_shell_executable' from source: unknown 7557 1726882088.91767: variable 'ansible_connection' from source: unknown 7557 1726882088.91769: variable 'ansible_module_compression' from source: unknown 7557 1726882088.91771: variable 'ansible_shell_type' from source: unknown 7557 1726882088.91774: variable 'ansible_shell_executable' from source: unknown 7557 1726882088.91776: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882088.91778: variable 'ansible_pipelining' from source: unknown 7557 1726882088.91779: variable 'ansible_timeout' from source: unknown 7557 1726882088.91781: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882088.91851: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 7557 1726882088.91858: variable 'omit' from source: magic vars 7557 1726882088.91868: starting attempt loop 7557 
1726882088.91870: running the handler 7557 1726882088.91925: variable 'ansible_facts' from source: unknown 7557 1726882088.92375: _low_level_execute_command(): starting 7557 1726882088.92380: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7557 1726882088.92901: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882088.92905: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882088.92907: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882088.92912: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882088.92950: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882088.92965: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882088.93036: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882088.94699: stdout chunk (state=3): >>>/root <<< 7557 1726882088.94786: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882088.94825: stderr chunk (state=3): >>><<< 7557 1726882088.94828: stdout chunk (state=3): >>><<< 7557 1726882088.94847: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7557 1726882088.94858: _low_level_execute_command(): starting 7557 1726882088.94864: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo 
/root/.ansible/tmp/ansible-tmp-1726882088.9484792-8211-40519577095979 `" && echo ansible-tmp-1726882088.9484792-8211-40519577095979="` echo /root/.ansible/tmp/ansible-tmp-1726882088.9484792-8211-40519577095979 `" ) && sleep 0' 7557 1726882088.95332: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7557 1726882088.95335: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882088.95338: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration <<< 7557 1726882088.95340: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7557 1726882088.95342: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882088.95399: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882088.95405: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882088.95407: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882088.95450: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882088.97289: stdout chunk (state=3): >>>ansible-tmp-1726882088.9484792-8211-40519577095979=/root/.ansible/tmp/ansible-tmp-1726882088.9484792-8211-40519577095979 <<< 7557 1726882088.97400: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882088.97428: stderr chunk (state=3): >>><<< 7557 1726882088.97431: stdout chunk (state=3): >>><<< 7557 1726882088.97448: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882088.9484792-8211-40519577095979=/root/.ansible/tmp/ansible-tmp-1726882088.9484792-8211-40519577095979 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 
debug2: Received exit status from master 0 7557 1726882088.97473: variable 'ansible_module_compression' from source: unknown 7557 1726882088.97523: ANSIBALLZ: Using generic lock for ansible.legacy.systemd 7557 1726882088.97526: ANSIBALLZ: Acquiring lock 7557 1726882088.97529: ANSIBALLZ: Lock acquired: 140194287013904 7557 1726882088.97531: ANSIBALLZ: Creating module 7557 1726882089.17257: ANSIBALLZ: Writing module into payload 7557 1726882089.17359: ANSIBALLZ: Writing module 7557 1726882089.17383: ANSIBALLZ: Renaming module 7557 1726882089.17389: ANSIBALLZ: Done creating module 7557 1726882089.17420: variable 'ansible_facts' from source: unknown 7557 1726882089.17554: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882088.9484792-8211-40519577095979/AnsiballZ_systemd.py 7557 1726882089.17661: Sending initial data 7557 1726882089.17664: Sent initial data (153 bytes) 7557 1726882089.18126: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7557 1726882089.18134: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882089.18139: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration <<< 7557 1726882089.18141: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7557 1726882089.18143: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882089.18189: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882089.18196: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882089.18199: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882089.18257: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882089.19907: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 7557 1726882089.19946: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 7557 1726882089.19996: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-7557ap94rh2e/tmp4nm6k21r /root/.ansible/tmp/ansible-tmp-1726882088.9484792-8211-40519577095979/AnsiballZ_systemd.py <<< 7557 1726882089.19999: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882088.9484792-8211-40519577095979/AnsiballZ_systemd.py" <<< 7557 1726882089.20039: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-7557ap94rh2e/tmp4nm6k21r" to remote "/root/.ansible/tmp/ansible-tmp-1726882088.9484792-8211-40519577095979/AnsiballZ_systemd.py" <<< 7557 1726882089.20042: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882088.9484792-8211-40519577095979/AnsiballZ_systemd.py" <<< 7557 1726882089.21104: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882089.21144: stderr chunk (state=3): >>><<< 7557 1726882089.21147: stdout chunk (state=3): >>><<< 7557 1726882089.21167: done transferring module to remote 7557 1726882089.21176: _low_level_execute_command(): starting 7557 1726882089.21186: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882088.9484792-8211-40519577095979/ /root/.ansible/tmp/ansible-tmp-1726882088.9484792-8211-40519577095979/AnsiballZ_systemd.py && sleep 0' 7557 1726882089.21647: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882089.21651: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882089.21653: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882089.21655: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882089.21710: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882089.21713: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882089.21719: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882089.21765: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882089.23547: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882089.23576: stderr chunk (state=3): >>><<< 7557 1726882089.23581: stdout chunk (state=3): >>><<< 7557 1726882089.23595: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final 
all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7557 1726882089.23600: _low_level_execute_command(): starting 7557 1726882089.23606: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882088.9484792-8211-40519577095979/AnsiballZ_systemd.py && sleep 0' 7557 1726882089.24055: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882089.24058: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found <<< 7557 1726882089.24061: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 7557 1726882089.24063: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882089.24065: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882089.24116: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882089.24121: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882089.24176: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882089.52962: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "711", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", 
"NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:22:06 EDT", "ExecMainStartTimestampMonotonic": "33869352", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 21:22:06 EDT", "ExecMainHandoffTimestampMonotonic": "33887880", "ExecMainPID": "711", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[Fri 2024-09-20 21:22:06 EDT] ; stop_time=[n/a] ; pid=711 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[Fri 2024-09-20 21:22:06 EDT] ; stop_time=[n/a] ; pid=711 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2938", "MemoryCurrent": "9371648", "MemoryPeak": "9871360", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3315302400", "EffectiveMemoryMax": "3702878208", "EffectiveMemoryHigh": "3702878208", "CPUUsageNSec": "92294000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "M<<< 7557 1726882089.52984: stdout chunk (state=3): >>>emoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", 
"LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", 
"Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket sysinit.target system.slice", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "cloud-init.service shutdown.target NetworkManager-wait-online.service net<<< 7557 1726882089.52998: stdout chunk (state=3): >>>work.target multi-user.target", "After": "system.slice systemd-journald.socket dbus.socket sysinit.target network-pre.target cloud-init-local.service basic.target dbus-broker.service", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:22:07 EDT", "StateChangeTimestampMonotonic": "34618487", "InactiveExitTimestamp": "Fri 2024-09-20 21:22:06 EDT", "InactiveExitTimestampMonotonic": "33869684", "ActiveEnterTimestamp": "Fri 2024-09-20 21:22:07 EDT", "ActiveEnterTimestampMonotonic": "34618487", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:22:06 EDT", "ConditionTimestampMonotonic": "33868497", "AssertTimestamp": "Fri 2024-09-20 21:22:06 EDT", "AssertTimestampMonotonic": "33868500", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "4af90bef47894571b134cea51cfa03e2", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 7557 1726882089.54723: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. 
<<< 7557 1726882089.54754: stderr chunk (state=3): >>><<< 7557 1726882089.54757: stdout chunk (state=3): >>><<< 7557 1726882089.54774: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "711", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:22:06 EDT", "ExecMainStartTimestampMonotonic": "33869352", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 21:22:06 EDT", "ExecMainHandoffTimestampMonotonic": "33887880", "ExecMainPID": "711", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[Fri 2024-09-20 21:22:06 EDT] ; stop_time=[n/a] ; pid=711 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[Fri 2024-09-20 21:22:06 EDT] ; stop_time=[n/a] ; pid=711 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2938", "MemoryCurrent": "9371648", "MemoryPeak": "9871360", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3315302400", "EffectiveMemoryMax": "3702878208", "EffectiveMemoryHigh": "3702878208", "CPUUsageNSec": "92294000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", 
"MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket sysinit.target system.slice", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "cloud-init.service shutdown.target NetworkManager-wait-online.service network.target multi-user.target", "After": "system.slice systemd-journald.socket dbus.socket sysinit.target network-pre.target cloud-init-local.service basic.target dbus-broker.service", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:22:07 EDT", "StateChangeTimestampMonotonic": "34618487", "InactiveExitTimestamp": "Fri 2024-09-20 21:22:06 EDT", "InactiveExitTimestampMonotonic": "33869684", "ActiveEnterTimestamp": "Fri 2024-09-20 21:22:07 EDT", "ActiveEnterTimestampMonotonic": "34618487", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:22:06 EDT", "ConditionTimestampMonotonic": "33868497", "AssertTimestamp": "Fri 2024-09-20 21:22:06 EDT", "AssertTimestampMonotonic": "33868500", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "4af90bef47894571b134cea51cfa03e2", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration 
data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. 7557 1726882089.54898: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882088.9484792-8211-40519577095979/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7557 1726882089.54915: _low_level_execute_command(): starting 7557 1726882089.54918: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882088.9484792-8211-40519577095979/ > /dev/null 2>&1 && sleep 0' 7557 1726882089.55369: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882089.55373: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882089.55375: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration <<< 7557 1726882089.55377: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7557 1726882089.55379: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882089.55432: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882089.55435: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK 
<<< 7557 1726882089.55439: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882089.55483: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882089.57257: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882089.57282: stderr chunk (state=3): >>><<< 7557 1726882089.57287: stdout chunk (state=3): >>><<< 7557 1726882089.57304: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7557 1726882089.57310: handler run complete 7557 1726882089.57350: attempt loop complete, returning result 7557 1726882089.57353: _execute() done 7557 1726882089.57356: dumping result to json 7557 1726882089.57367: done dumping result, returning 7557 1726882089.57376: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [12673a56-9f93-ed48-b3a5-000000000022] 7557 1726882089.57380: sending task result for task 12673a56-9f93-ed48-b3a5-000000000022 7557 1726882089.57617: done sending task result for task 12673a56-9f93-ed48-b3a5-000000000022 7557 1726882089.57620: WORKER PROCESS EXITING ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 7557 1726882089.57668: no more pending results, returning what we have 7557 1726882089.57671: results queue empty 7557 1726882089.57672: checking for any_errors_fatal 7557 1726882089.57679: done checking for any_errors_fatal 7557 1726882089.57680: checking for max_fail_percentage 7557 1726882089.57681: done checking for max_fail_percentage 7557 1726882089.57682: checking to see if all hosts have failed and the running result is not ok 7557 1726882089.57683: done checking to see if all hosts have failed 7557 1726882089.57683: getting the remaining hosts for this loop 7557 1726882089.57685: done getting the remaining hosts for this loop 7557 1726882089.57688: getting the next task for host managed_node3 7557 1726882089.57702: done getting next task for host managed_node3 7557 1726882089.57706: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 7557 1726882089.57708: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, 
pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7557 1726882089.57718: getting variables 7557 1726882089.57719: in VariableManager get_vars() 7557 1726882089.57765: Calling all_inventory to load vars for managed_node3 7557 1726882089.57768: Calling groups_inventory to load vars for managed_node3 7557 1726882089.57770: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882089.57779: Calling all_plugins_play to load vars for managed_node3 7557 1726882089.57781: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882089.57784: Calling groups_plugins_play to load vars for managed_node3 7557 1726882089.58555: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882089.59405: done with get_vars() 7557 1726882089.59421: done getting variables 7557 1726882089.59466: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 21:28:09 -0400 (0:00:00.744) 0:00:15.447 ****** 7557 1726882089.59489: entering _queue_task() for managed_node3/service 7557 1726882089.59712: worker is 1 (out of 1 available) 7557 1726882089.59726: exiting _queue_task() for managed_node3/service 7557 1726882089.59739: done queuing things up, now waiting for results queue to drain 7557 1726882089.59740: waiting for pending results... 
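The records above trace the full lifecycle of the "Enable and start NetworkManager" task: ansible builds the AnsiballZ_systemd.py payload, uploads it over the multiplexed SSH connection with sftp, marks it executable, runs it with /usr/bin/python3.12, reads the JSON unit status back, removes the remote tmp directory, and reports "ok" (censored because no_log was set). As a hedged illustration only -- not the role's actual task file, which dispatches through the service action plugin and resolved to ansible.legacy.systemd on this host -- a minimal task producing the module_args recorded in the "done with _execute_module" line (name=NetworkManager, state=started, enabled=true, no_log) could look like this:

    # Illustrative sketch; the name/state/enabled/no_log values are copied from the
    # logged module_args, everything else (module spelling, surrounding play) is assumed.
    - name: Enable and start NetworkManager (illustrative equivalent)
      ansible.builtin.systemd:
        name: NetworkManager
        state: started
        enabled: true
      no_log: true        # explains the "censored" result printed in the log above
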
7557 1726882089.59921: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 7557 1726882089.60017: in run() - task 12673a56-9f93-ed48-b3a5-000000000023 7557 1726882089.60029: variable 'ansible_search_path' from source: unknown 7557 1726882089.60032: variable 'ansible_search_path' from source: unknown 7557 1726882089.60062: calling self._execute() 7557 1726882089.60137: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882089.60141: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882089.60149: variable 'omit' from source: magic vars 7557 1726882089.60429: variable 'ansible_distribution_major_version' from source: facts 7557 1726882089.60438: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882089.60520: variable 'network_provider' from source: set_fact 7557 1726882089.60524: Evaluated conditional (network_provider == "nm"): True 7557 1726882089.60586: variable '__network_wpa_supplicant_required' from source: role '' defaults 7557 1726882089.60650: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 7557 1726882089.60766: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 7557 1726882089.62410: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 7557 1726882089.62455: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 7557 1726882089.62483: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 7557 1726882089.62512: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 7557 1726882089.62532: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 7557 1726882089.62595: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7557 1726882089.62617: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7557 1726882089.62635: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7557 1726882089.62661: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7557 1726882089.62671: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7557 1726882089.62711: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7557 1726882089.62727: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, 
class_only=False) 7557 1726882089.62743: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7557 1726882089.62767: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7557 1726882089.62777: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7557 1726882089.62813: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7557 1726882089.62830: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7557 1726882089.62846: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7557 1726882089.62869: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7557 1726882089.62879: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7557 1726882089.62975: variable 'network_connections' from source: task vars 7557 1726882089.62986: variable 'interface' from source: play vars 7557 1726882089.63046: variable 'interface' from source: play vars 7557 1726882089.63102: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 7557 1726882089.63222: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 7557 1726882089.63252: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 7557 1726882089.63274: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 7557 1726882089.63299: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 7557 1726882089.63330: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 7557 1726882089.63349: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 7557 1726882089.63366: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 7557 1726882089.63384: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 7557 1726882089.63424: variable 
'__network_wireless_connections_defined' from source: role '' defaults 7557 1726882089.63578: variable 'network_connections' from source: task vars 7557 1726882089.63582: variable 'interface' from source: play vars 7557 1726882089.63630: variable 'interface' from source: play vars 7557 1726882089.63662: Evaluated conditional (__network_wpa_supplicant_required): False 7557 1726882089.63666: when evaluation is False, skipping this task 7557 1726882089.63668: _execute() done 7557 1726882089.63671: dumping result to json 7557 1726882089.63673: done dumping result, returning 7557 1726882089.63683: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [12673a56-9f93-ed48-b3a5-000000000023] 7557 1726882089.63696: sending task result for task 12673a56-9f93-ed48-b3a5-000000000023 7557 1726882089.63775: done sending task result for task 12673a56-9f93-ed48-b3a5-000000000023 7557 1726882089.63778: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 7557 1726882089.63825: no more pending results, returning what we have 7557 1726882089.63829: results queue empty 7557 1726882089.63830: checking for any_errors_fatal 7557 1726882089.63849: done checking for any_errors_fatal 7557 1726882089.63849: checking for max_fail_percentage 7557 1726882089.63851: done checking for max_fail_percentage 7557 1726882089.63852: checking to see if all hosts have failed and the running result is not ok 7557 1726882089.63853: done checking to see if all hosts have failed 7557 1726882089.63853: getting the remaining hosts for this loop 7557 1726882089.63855: done getting the remaining hosts for this loop 7557 1726882089.63858: getting the next task for host managed_node3 7557 1726882089.63865: done getting next task for host managed_node3 7557 1726882089.63869: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 7557 1726882089.63871: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7557 1726882089.63884: getting variables 7557 1726882089.63886: in VariableManager get_vars() 7557 1726882089.63937: Calling all_inventory to load vars for managed_node3 7557 1726882089.63940: Calling groups_inventory to load vars for managed_node3 7557 1726882089.63942: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882089.63952: Calling all_plugins_play to load vars for managed_node3 7557 1726882089.63954: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882089.63957: Calling groups_plugins_play to load vars for managed_node3 7557 1726882089.64846: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882089.65698: done with get_vars() 7557 1726882089.65717: done getting variables 7557 1726882089.65762: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 21:28:09 -0400 (0:00:00.062) 0:00:15.510 ****** 7557 1726882089.65785: entering _queue_task() for managed_node3/service 7557 1726882089.66029: worker is 1 (out of 1 available) 7557 1726882089.66042: exiting _queue_task() for managed_node3/service 7557 1726882089.66054: done queuing things up, now waiting for results queue to drain 7557 1726882089.66055: waiting for pending results... 7557 1726882089.66238: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable network service 7557 1726882089.66339: in run() - task 12673a56-9f93-ed48-b3a5-000000000024 7557 1726882089.66352: variable 'ansible_search_path' from source: unknown 7557 1726882089.66355: variable 'ansible_search_path' from source: unknown 7557 1726882089.66387: calling self._execute() 7557 1726882089.66464: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882089.66468: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882089.66476: variable 'omit' from source: magic vars 7557 1726882089.66761: variable 'ansible_distribution_major_version' from source: facts 7557 1726882089.66771: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882089.66854: variable 'network_provider' from source: set_fact 7557 1726882089.66860: Evaluated conditional (network_provider == "initscripts"): False 7557 1726882089.66863: when evaluation is False, skipping this task 7557 1726882089.66866: _execute() done 7557 1726882089.66868: dumping result to json 7557 1726882089.66872: done dumping result, returning 7557 1726882089.66879: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable network service [12673a56-9f93-ed48-b3a5-000000000024] 7557 1726882089.66883: sending task result for task 12673a56-9f93-ed48-b3a5-000000000024 7557 1726882089.66974: done sending task result for task 12673a56-9f93-ed48-b3a5-000000000024 7557 1726882089.66976: WORKER PROCESS EXITING skipping: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 7557 1726882089.67020: no more pending results, 
returning what we have 7557 1726882089.67024: results queue empty 7557 1726882089.67025: checking for any_errors_fatal 7557 1726882089.67033: done checking for any_errors_fatal 7557 1726882089.67033: checking for max_fail_percentage 7557 1726882089.67035: done checking for max_fail_percentage 7557 1726882089.67036: checking to see if all hosts have failed and the running result is not ok 7557 1726882089.67037: done checking to see if all hosts have failed 7557 1726882089.67037: getting the remaining hosts for this loop 7557 1726882089.67039: done getting the remaining hosts for this loop 7557 1726882089.67042: getting the next task for host managed_node3 7557 1726882089.67048: done getting next task for host managed_node3 7557 1726882089.67052: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 7557 1726882089.67055: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7557 1726882089.67071: getting variables 7557 1726882089.67073: in VariableManager get_vars() 7557 1726882089.67122: Calling all_inventory to load vars for managed_node3 7557 1726882089.67125: Calling groups_inventory to load vars for managed_node3 7557 1726882089.67127: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882089.67136: Calling all_plugins_play to load vars for managed_node3 7557 1726882089.67138: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882089.67141: Calling groups_plugins_play to load vars for managed_node3 7557 1726882089.67906: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882089.68846: done with get_vars() 7557 1726882089.68862: done getting variables 7557 1726882089.68908: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 21:28:09 -0400 (0:00:00.031) 0:00:15.542 ****** 7557 1726882089.68933: entering _queue_task() for managed_node3/copy 7557 1726882089.69162: worker is 1 (out of 1 available) 7557 1726882089.69175: exiting _queue_task() for managed_node3/copy 7557 1726882089.69187: done queuing things up, now waiting for results queue to drain 7557 1726882089.69189: waiting for pending results... 
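Between the NetworkManager task and this point, two role tasks were evaluated and skipped rather than executed: "Enable and start wpa_supplicant" (false_condition: __network_wpa_supplicant_required) and "Enable network service" (network_provider == "initscripts" evaluated False, since the provider here is nm). A hedged sketch of the conditional pattern behind those skips -- only the when: expressions come from the log; the task bodies and service names are assumed from the task titles, and the ansible_distribution_major_version != '6' guard that also appears in the log is omitted for brevity:

    # Illustrative sketch of the skip conditions logged above; not the role's task file.
    - name: Enable and start wpa_supplicant (illustrative equivalent)
      ansible.builtin.systemd:
        name: wpa_supplicant
        state: started
        enabled: true
      when: __network_wpa_supplicant_required        # False on this run, so the task was skipped

    - name: Enable network service (illustrative equivalent)
      ansible.builtin.service:
        name: network
        state: started
        enabled: true
      when: network_provider == "initscripts"        # provider is "nm" here, so skipped as well
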
7557 1726882089.69363: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 7557 1726882089.69463: in run() - task 12673a56-9f93-ed48-b3a5-000000000025 7557 1726882089.69475: variable 'ansible_search_path' from source: unknown 7557 1726882089.69479: variable 'ansible_search_path' from source: unknown 7557 1726882089.69514: calling self._execute() 7557 1726882089.69584: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882089.69589: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882089.69602: variable 'omit' from source: magic vars 7557 1726882089.69874: variable 'ansible_distribution_major_version' from source: facts 7557 1726882089.69885: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882089.69967: variable 'network_provider' from source: set_fact 7557 1726882089.69972: Evaluated conditional (network_provider == "initscripts"): False 7557 1726882089.69976: when evaluation is False, skipping this task 7557 1726882089.69979: _execute() done 7557 1726882089.69981: dumping result to json 7557 1726882089.69984: done dumping result, returning 7557 1726882089.69992: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [12673a56-9f93-ed48-b3a5-000000000025] 7557 1726882089.70000: sending task result for task 12673a56-9f93-ed48-b3a5-000000000025 7557 1726882089.70084: done sending task result for task 12673a56-9f93-ed48-b3a5-000000000025 7557 1726882089.70087: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 7557 1726882089.70132: no more pending results, returning what we have 7557 1726882089.70136: results queue empty 7557 1726882089.70137: checking for any_errors_fatal 7557 1726882089.70144: done checking for any_errors_fatal 7557 1726882089.70145: checking for max_fail_percentage 7557 1726882089.70146: done checking for max_fail_percentage 7557 1726882089.70147: checking to see if all hosts have failed and the running result is not ok 7557 1726882089.70148: done checking to see if all hosts have failed 7557 1726882089.70148: getting the remaining hosts for this loop 7557 1726882089.70149: done getting the remaining hosts for this loop 7557 1726882089.70153: getting the next task for host managed_node3 7557 1726882089.70158: done getting next task for host managed_node3 7557 1726882089.70162: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 7557 1726882089.70166: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7557 1726882089.70180: getting variables 7557 1726882089.70181: in VariableManager get_vars() 7557 1726882089.70228: Calling all_inventory to load vars for managed_node3 7557 1726882089.70231: Calling groups_inventory to load vars for managed_node3 7557 1726882089.70233: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882089.70241: Calling all_plugins_play to load vars for managed_node3 7557 1726882089.70243: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882089.70246: Calling groups_plugins_play to load vars for managed_node3 7557 1726882089.70996: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882089.71844: done with get_vars() 7557 1726882089.71859: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 21:28:09 -0400 (0:00:00.029) 0:00:15.572 ****** 7557 1726882089.71920: entering _queue_task() for managed_node3/fedora.linux_system_roles.network_connections 7557 1726882089.71922: Creating lock for fedora.linux_system_roles.network_connections 7557 1726882089.72144: worker is 1 (out of 1 available) 7557 1726882089.72157: exiting _queue_task() for managed_node3/fedora.linux_system_roles.network_connections 7557 1726882089.72169: done queuing things up, now waiting for results queue to drain 7557 1726882089.72171: waiting for pending results... 7557 1726882089.72349: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 7557 1726882089.72438: in run() - task 12673a56-9f93-ed48-b3a5-000000000026 7557 1726882089.72450: variable 'ansible_search_path' from source: unknown 7557 1726882089.72454: variable 'ansible_search_path' from source: unknown 7557 1726882089.72482: calling self._execute() 7557 1726882089.72559: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882089.72563: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882089.72572: variable 'omit' from source: magic vars 7557 1726882089.72839: variable 'ansible_distribution_major_version' from source: facts 7557 1726882089.72849: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882089.72855: variable 'omit' from source: magic vars 7557 1726882089.72892: variable 'omit' from source: magic vars 7557 1726882089.73005: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 7557 1726882089.74411: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 7557 1726882089.74455: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 7557 1726882089.74485: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 7557 1726882089.74514: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 7557 1726882089.74534: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 7557 1726882089.74592: variable 'network_provider' from source: set_fact 7557 1726882089.74684: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7557 1726882089.74725: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7557 1726882089.74742: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7557 1726882089.74768: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7557 1726882089.74778: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7557 1726882089.74836: variable 'omit' from source: magic vars 7557 1726882089.74915: variable 'omit' from source: magic vars 7557 1726882089.74982: variable 'network_connections' from source: task vars 7557 1726882089.74992: variable 'interface' from source: play vars 7557 1726882089.75043: variable 'interface' from source: play vars 7557 1726882089.75157: variable 'omit' from source: magic vars 7557 1726882089.75165: variable '__lsr_ansible_managed' from source: task vars 7557 1726882089.75208: variable '__lsr_ansible_managed' from source: task vars 7557 1726882089.75609: Loaded config def from plugin (lookup/template) 7557 1726882089.75613: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 7557 1726882089.75634: File lookup term: get_ansible_managed.j2 7557 1726882089.75637: variable 'ansible_search_path' from source: unknown 7557 1726882089.75642: evaluation_path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 7557 1726882089.75652: search_path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 7557 1726882089.75669: variable 'ansible_search_path' from source: unknown 7557 1726882089.78685: variable 'ansible_managed' from source: unknown 7557 1726882089.78763: variable 'omit' from source: magic vars 7557 1726882089.78786: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7557 1726882089.78817: Loading Connection 'ssh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7557 1726882089.78832: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7557 1726882089.78845: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7557 1726882089.78853: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7557 1726882089.78875: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7557 1726882089.78878: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882089.78880: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882089.78952: Set connection var ansible_module_compression to ZIP_DEFLATED 7557 1726882089.78958: Set connection var ansible_shell_executable to /bin/sh 7557 1726882089.78960: Set connection var ansible_shell_type to sh 7557 1726882089.78965: Set connection var ansible_pipelining to False 7557 1726882089.78968: Set connection var ansible_connection to ssh 7557 1726882089.78972: Set connection var ansible_timeout to 10 7557 1726882089.78988: variable 'ansible_shell_executable' from source: unknown 7557 1726882089.78991: variable 'ansible_connection' from source: unknown 7557 1726882089.78995: variable 'ansible_module_compression' from source: unknown 7557 1726882089.79000: variable 'ansible_shell_type' from source: unknown 7557 1726882089.79004: variable 'ansible_shell_executable' from source: unknown 7557 1726882089.79006: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882089.79011: variable 'ansible_pipelining' from source: unknown 7557 1726882089.79013: variable 'ansible_timeout' from source: unknown 7557 1726882089.79015: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882089.79105: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 7557 1726882089.79117: variable 'omit' from source: magic vars 7557 1726882089.79120: starting attempt loop 7557 1726882089.79123: running the handler 7557 1726882089.79137: _low_level_execute_command(): starting 7557 1726882089.79143: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7557 1726882089.79629: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882089.79633: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882089.79635: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882089.79637: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882089.79697: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882089.79705: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882089.79707: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882089.79755: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882089.81412: stdout chunk (state=3): >>>/root <<< 7557 1726882089.81511: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882089.81542: stderr chunk (state=3): >>><<< 7557 1726882089.81545: stdout chunk (state=3): >>><<< 7557 1726882089.81563: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7557 1726882089.81574: _low_level_execute_command(): starting 7557 1726882089.81580: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882089.8156438-8229-37442474612042 `" && echo ansible-tmp-1726882089.8156438-8229-37442474612042="` echo /root/.ansible/tmp/ansible-tmp-1726882089.8156438-8229-37442474612042 `" ) && sleep 0' 7557 1726882089.82031: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882089.82034: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882089.82036: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration <<< 7557 1726882089.82038: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7557 1726882089.82040: stderr chunk (state=3): >>>debug2: checking match for 
'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882089.82091: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882089.82101: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882089.82103: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882089.82145: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882089.83992: stdout chunk (state=3): >>>ansible-tmp-1726882089.8156438-8229-37442474612042=/root/.ansible/tmp/ansible-tmp-1726882089.8156438-8229-37442474612042 <<< 7557 1726882089.84095: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882089.84125: stderr chunk (state=3): >>><<< 7557 1726882089.84128: stdout chunk (state=3): >>><<< 7557 1726882089.84142: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882089.8156438-8229-37442474612042=/root/.ansible/tmp/ansible-tmp-1726882089.8156438-8229-37442474612042 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7557 1726882089.84179: variable 'ansible_module_compression' from source: unknown 7557 1726882089.84223: ANSIBALLZ: Using lock for fedora.linux_system_roles.network_connections 7557 1726882089.84227: ANSIBALLZ: Acquiring lock 7557 1726882089.84229: ANSIBALLZ: Lock acquired: 140194281272672 7557 1726882089.84232: ANSIBALLZ: Creating module 7557 1726882090.02495: ANSIBALLZ: Writing module into payload 7557 1726882090.02763: ANSIBALLZ: Writing module 7557 1726882090.02796: ANSIBALLZ: Renaming module 7557 1726882090.02809: ANSIBALLZ: Done creating module 7557 1726882090.02844: variable 'ansible_facts' from source: unknown 7557 1726882090.02961: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882089.8156438-8229-37442474612042/AnsiballZ_network_connections.py 7557 1726882090.03116: Sending initial data 7557 1726882090.03152: Sent initial data (165 bytes) 7557 1726882090.03914: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 
originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882090.04022: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882090.04048: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882090.05686: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 7557 1726882090.05726: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 7557 1726882090.05774: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-7557ap94rh2e/tmpqjnqurne /root/.ansible/tmp/ansible-tmp-1726882089.8156438-8229-37442474612042/AnsiballZ_network_connections.py <<< 7557 1726882090.05777: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882089.8156438-8229-37442474612042/AnsiballZ_network_connections.py" <<< 7557 1726882090.05817: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-7557ap94rh2e/tmpqjnqurne" to remote "/root/.ansible/tmp/ansible-tmp-1726882089.8156438-8229-37442474612042/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882089.8156438-8229-37442474612042/AnsiballZ_network_connections.py" <<< 7557 1726882090.06675: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882090.06678: stderr chunk (state=3): >>><<< 7557 1726882090.06680: stdout chunk (state=3): >>><<< 7557 1726882090.06698: done transferring module to remote 7557 1726882090.06707: _low_level_execute_command(): starting 7557 1726882090.06712: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882089.8156438-8229-37442474612042/ /root/.ansible/tmp/ansible-tmp-1726882089.8156438-8229-37442474612042/AnsiballZ_network_connections.py && sleep 0' 7557 1726882090.07311: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7557 1726882090.07314: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7557 1726882090.07333: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882090.07352: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7557 1726882090.07374: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882090.07379: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address <<< 7557 1726882090.07384: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7557 1726882090.07421: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882090.07486: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882090.07490: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882090.07555: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882090.09299: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882090.09335: stderr chunk (state=3): >>><<< 7557 1726882090.09338: stdout chunk (state=3): >>><<< 7557 1726882090.09351: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7557 1726882090.09354: _low_level_execute_command(): starting 7557 1726882090.09361: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882089.8156438-8229-37442474612042/AnsiballZ_network_connections.py && sleep 0' 7557 1726882090.10112: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882090.10182: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882090.83143: stdout chunk (state=3): >>> <<< 7557 1726882090.83161: stdout chunk (state=3): >>>{"changed": true, "warnings": [], "stderr": "[003] #0, state:up persistent_state:present, 'veth0': add connection veth0, 24b01189-d26a-4e67-9260-0cb7eb810428\n[004] #0, state:up persistent_state:present, 'veth0': up connection veth0, 24b01189-d26a-4e67-9260-0cb7eb810428 (not-active)\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "veth0", "type": "ethernet", "state": "up", "ip": {"auto_gateway": true, "dhcp4": false, "auto6": false, "address": ["2001:db8::2/64", "203.0.113.2/24"], "gateway6": "2001:db8::1", "gateway4": "203.0.113.1", "route_metric4": 65535}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "veth0", "type": "ethernet", 
"state": "up", "ip": {"auto_gateway": true, "dhcp4": false, "auto6": false, "address": ["2001:db8::2/64", "203.0.113.2/24"], "gateway6": "2001:db8::1", "gateway4": "203.0.113.1", "route_metric4": 65535}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 7557 1726882090.85033: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. <<< 7557 1726882090.85066: stderr chunk (state=3): >>><<< 7557 1726882090.85069: stdout chunk (state=3): >>><<< 7557 1726882090.85090: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "warnings": [], "stderr": "[003] #0, state:up persistent_state:present, 'veth0': add connection veth0, 24b01189-d26a-4e67-9260-0cb7eb810428\n[004] #0, state:up persistent_state:present, 'veth0': up connection veth0, 24b01189-d26a-4e67-9260-0cb7eb810428 (not-active)\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "veth0", "type": "ethernet", "state": "up", "ip": {"auto_gateway": true, "dhcp4": false, "auto6": false, "address": ["2001:db8::2/64", "203.0.113.2/24"], "gateway6": "2001:db8::1", "gateway4": "203.0.113.1", "route_metric4": 65535}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "veth0", "type": "ethernet", "state": "up", "ip": {"auto_gateway": true, "dhcp4": false, "auto6": false, "address": ["2001:db8::2/64", "203.0.113.2/24"], "gateway6": "2001:db8::1", "gateway4": "203.0.113.1", "route_metric4": 65535}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. 
7557 1726882090.85127: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'veth0', 'type': 'ethernet', 'state': 'up', 'ip': {'auto_gateway': True, 'dhcp4': False, 'auto6': False, 'address': ['2001:db8::2/64', '203.0.113.2/24'], 'gateway6': '2001:db8::1', 'gateway4': '203.0.113.1', 'route_metric4': 65535}}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882089.8156438-8229-37442474612042/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7557 1726882090.85134: _low_level_execute_command(): starting 7557 1726882090.85139: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882089.8156438-8229-37442474612042/ > /dev/null 2>&1 && sleep 0' 7557 1726882090.85704: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882090.85714: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882090.85726: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882090.85776: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882090.85783: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882090.85785: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882090.85832: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882090.87771: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882090.87778: stdout chunk (state=3): >>><<< 7557 1726882090.87780: stderr chunk (state=3): >>><<< 7557 1726882090.87796: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7557 1726882090.87949: handler run complete 7557 1726882090.87952: attempt loop complete, returning result 7557 1726882090.87954: _execute() done 7557 1726882090.87956: dumping result to json 7557 1726882090.87958: done dumping result, returning 7557 1726882090.87960: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [12673a56-9f93-ed48-b3a5-000000000026] 7557 1726882090.87961: sending task result for task 12673a56-9f93-ed48-b3a5-000000000026 7557 1726882090.88040: done sending task result for task 12673a56-9f93-ed48-b3a5-000000000026 7557 1726882090.88043: WORKER PROCESS EXITING changed: [managed_node3] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "ip": { "address": [ "2001:db8::2/64", "203.0.113.2/24" ], "auto6": false, "auto_gateway": true, "dhcp4": false, "gateway4": "203.0.113.1", "gateway6": "2001:db8::1", "route_metric4": 65535 }, "name": "veth0", "state": "up", "type": "ethernet" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: [003] #0, state:up persistent_state:present, 'veth0': add connection veth0, 24b01189-d26a-4e67-9260-0cb7eb810428 [004] #0, state:up persistent_state:present, 'veth0': up connection veth0, 24b01189-d26a-4e67-9260-0cb7eb810428 (not-active) 7557 1726882090.88179: no more pending results, returning what we have 7557 1726882090.88183: results queue empty 7557 1726882090.88184: checking for any_errors_fatal 7557 1726882090.88191: done checking for any_errors_fatal 7557 1726882090.88192: checking for max_fail_percentage 7557 1726882090.88196: done checking for max_fail_percentage 7557 1726882090.88197: checking to see if all hosts have failed and the running result is not ok 7557 1726882090.88198: done checking to see if all hosts have failed 7557 1726882090.88198: getting the remaining hosts for this loop 7557 1726882090.88200: done getting the remaining hosts for this loop 7557 1726882090.88203: getting the next task for host managed_node3 7557 1726882090.88209: done getting next task for host managed_node3 7557 1726882090.88213: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 7557 1726882090.88216: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7557 1726882090.88226: getting variables 7557 1726882090.88228: in VariableManager get_vars() 7557 1726882090.88277: Calling all_inventory to load vars for managed_node3 7557 1726882090.88280: Calling groups_inventory to load vars for managed_node3 7557 1726882090.88284: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882090.88703: Calling all_plugins_play to load vars for managed_node3 7557 1726882090.88715: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882090.88719: Calling groups_plugins_play to load vars for managed_node3 7557 1726882090.91000: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882090.92315: done with get_vars() 7557 1726882090.92338: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 21:28:10 -0400 (0:00:01.205) 0:00:16.777 ****** 7557 1726882090.92430: entering _queue_task() for managed_node3/fedora.linux_system_roles.network_state 7557 1726882090.92432: Creating lock for fedora.linux_system_roles.network_state 7557 1726882090.92848: worker is 1 (out of 1 available) 7557 1726882090.92860: exiting _queue_task() for managed_node3/fedora.linux_system_roles.network_state 7557 1726882090.92938: done queuing things up, now waiting for results queue to drain 7557 1726882090.92941: waiting for pending results... 7557 1726882090.93098: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking state 7557 1726882090.93245: in run() - task 12673a56-9f93-ed48-b3a5-000000000027 7557 1726882090.93250: variable 'ansible_search_path' from source: unknown 7557 1726882090.93253: variable 'ansible_search_path' from source: unknown 7557 1726882090.93275: calling self._execute() 7557 1726882090.93377: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882090.93392: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882090.93411: variable 'omit' from source: magic vars 7557 1726882090.93816: variable 'ansible_distribution_major_version' from source: facts 7557 1726882090.93898: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882090.93962: variable 'network_state' from source: role '' defaults 7557 1726882090.93979: Evaluated conditional (network_state != {}): False 7557 1726882090.93988: when evaluation is False, skipping this task 7557 1726882090.94004: _execute() done 7557 1726882090.94013: dumping result to json 7557 1726882090.94022: done dumping result, returning 7557 1726882090.94039: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking state [12673a56-9f93-ed48-b3a5-000000000027] 7557 1726882090.94052: sending task result for task 12673a56-9f93-ed48-b3a5-000000000027 7557 1726882090.94245: done sending task result for task 12673a56-9f93-ed48-b3a5-000000000027 7557 1726882090.94248: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 7557 1726882090.94304: no more pending results, returning what we have 7557 
1726882090.94309: results queue empty 7557 1726882090.94310: checking for any_errors_fatal 7557 1726882090.94323: done checking for any_errors_fatal 7557 1726882090.94324: checking for max_fail_percentage 7557 1726882090.94326: done checking for max_fail_percentage 7557 1726882090.94326: checking to see if all hosts have failed and the running result is not ok 7557 1726882090.94328: done checking to see if all hosts have failed 7557 1726882090.94328: getting the remaining hosts for this loop 7557 1726882090.94330: done getting the remaining hosts for this loop 7557 1726882090.94333: getting the next task for host managed_node3 7557 1726882090.94340: done getting next task for host managed_node3 7557 1726882090.94345: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 7557 1726882090.94348: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7557 1726882090.94366: getting variables 7557 1726882090.94368: in VariableManager get_vars() 7557 1726882090.94425: Calling all_inventory to load vars for managed_node3 7557 1726882090.94429: Calling groups_inventory to load vars for managed_node3 7557 1726882090.94432: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882090.94444: Calling all_plugins_play to load vars for managed_node3 7557 1726882090.94448: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882090.94451: Calling groups_plugins_play to load vars for managed_node3 7557 1726882090.96063: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882090.97631: done with get_vars() 7557 1726882090.97668: done getting variables 7557 1726882090.97730: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 21:28:10 -0400 (0:00:00.053) 0:00:16.830 ****** 7557 1726882090.97771: entering _queue_task() for managed_node3/debug 7557 1726882090.98221: worker is 1 (out of 1 available) 7557 1726882090.98232: exiting _queue_task() for managed_node3/debug 7557 1726882090.98242: done queuing things up, now waiting for results queue to drain 7557 1726882090.98243: waiting for pending results... 
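Both skips seen so far ("Ensure initscripts network file dependency is present" and "Configure networking state") follow the same pattern: the task's when: expression is rendered against the current variables and reduced to a boolean, and a False result short-circuits execution. A minimal illustration with plain Jinja2 follows; this is a simplification for reading the trace, not Ansible's actual conditional machinery.

    # Simplified illustration only: Ansible routes conditionals through its own
    # templar, but the net effect is an expression evaluated against task vars.
    from jinja2 import Environment

    when_clause = Environment().compile_expression("network_state != {}")

    # Role default: network_state is an empty dict, so the task is skipped,
    # matching "Evaluated conditional (network_state != {}): False" above.
    print(when_clause(network_state={}))                  # False -> skipped
    print(when_clause(network_state={"interfaces": []}))  # True  -> would run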
7557 1726882090.98510: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 7557 1726882090.98647: in run() - task 12673a56-9f93-ed48-b3a5-000000000028 7557 1726882090.98652: variable 'ansible_search_path' from source: unknown 7557 1726882090.98654: variable 'ansible_search_path' from source: unknown 7557 1726882090.98670: calling self._execute() 7557 1726882090.98781: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882090.98800: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882090.98816: variable 'omit' from source: magic vars 7557 1726882090.99299: variable 'ansible_distribution_major_version' from source: facts 7557 1726882090.99303: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882090.99305: variable 'omit' from source: magic vars 7557 1726882090.99314: variable 'omit' from source: magic vars 7557 1726882090.99357: variable 'omit' from source: magic vars 7557 1726882090.99405: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7557 1726882090.99455: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7557 1726882090.99478: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7557 1726882090.99500: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7557 1726882090.99549: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7557 1726882090.99564: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7557 1726882090.99572: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882090.99579: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882090.99698: Set connection var ansible_module_compression to ZIP_DEFLATED 7557 1726882090.99711: Set connection var ansible_shell_executable to /bin/sh 7557 1726882090.99736: Set connection var ansible_shell_type to sh 7557 1726882090.99743: Set connection var ansible_pipelining to False 7557 1726882090.99745: Set connection var ansible_connection to ssh 7557 1726882090.99747: Set connection var ansible_timeout to 10 7557 1726882090.99772: variable 'ansible_shell_executable' from source: unknown 7557 1726882090.99847: variable 'ansible_connection' from source: unknown 7557 1726882090.99850: variable 'ansible_module_compression' from source: unknown 7557 1726882090.99852: variable 'ansible_shell_type' from source: unknown 7557 1726882090.99854: variable 'ansible_shell_executable' from source: unknown 7557 1726882090.99856: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882090.99858: variable 'ansible_pipelining' from source: unknown 7557 1726882090.99860: variable 'ansible_timeout' from source: unknown 7557 1726882090.99862: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882090.99962: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 7557 1726882091.00061: variable 'omit' from source: 
magic vars 7557 1726882091.00064: starting attempt loop 7557 1726882091.00067: running the handler 7557 1726882091.00136: variable '__network_connections_result' from source: set_fact 7557 1726882091.00204: handler run complete 7557 1726882091.00225: attempt loop complete, returning result 7557 1726882091.00231: _execute() done 7557 1726882091.00237: dumping result to json 7557 1726882091.00244: done dumping result, returning 7557 1726882091.00256: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [12673a56-9f93-ed48-b3a5-000000000028] 7557 1726882091.00266: sending task result for task 12673a56-9f93-ed48-b3a5-000000000028 ok: [managed_node3] => { "__network_connections_result.stderr_lines": [ "[003] #0, state:up persistent_state:present, 'veth0': add connection veth0, 24b01189-d26a-4e67-9260-0cb7eb810428", "[004] #0, state:up persistent_state:present, 'veth0': up connection veth0, 24b01189-d26a-4e67-9260-0cb7eb810428 (not-active)" ] } 7557 1726882091.00467: no more pending results, returning what we have 7557 1726882091.00471: results queue empty 7557 1726882091.00472: checking for any_errors_fatal 7557 1726882091.00479: done checking for any_errors_fatal 7557 1726882091.00480: checking for max_fail_percentage 7557 1726882091.00481: done checking for max_fail_percentage 7557 1726882091.00482: checking to see if all hosts have failed and the running result is not ok 7557 1726882091.00483: done checking to see if all hosts have failed 7557 1726882091.00483: getting the remaining hosts for this loop 7557 1726882091.00485: done getting the remaining hosts for this loop 7557 1726882091.00488: getting the next task for host managed_node3 7557 1726882091.00496: done getting next task for host managed_node3 7557 1726882091.00501: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 7557 1726882091.00504: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7557 1726882091.00517: getting variables 7557 1726882091.00519: in VariableManager get_vars() 7557 1726882091.00570: Calling all_inventory to load vars for managed_node3 7557 1726882091.00572: Calling groups_inventory to load vars for managed_node3 7557 1726882091.00575: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882091.00585: Calling all_plugins_play to load vars for managed_node3 7557 1726882091.00588: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882091.00590: Calling groups_plugins_play to load vars for managed_node3 7557 1726882091.00817: done sending task result for task 12673a56-9f93-ed48-b3a5-000000000028 7557 1726882091.00821: WORKER PROCESS EXITING 7557 1726882091.02395: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882091.04008: done with get_vars() 7557 1726882091.04034: done getting variables 7557 1726882091.04095: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 21:28:11 -0400 (0:00:00.063) 0:00:16.894 ****** 7557 1726882091.04129: entering _queue_task() for managed_node3/debug 7557 1726882091.04463: worker is 1 (out of 1 available) 7557 1726882091.04476: exiting _queue_task() for managed_node3/debug 7557 1726882091.04489: done queuing things up, now waiting for results queue to drain 7557 1726882091.04490: waiting for pending results... 
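The debug task above prints __network_connections_result.stderr_lines, and the next task dumps the full registered result, which carries both the raw stderr string and the derived stderr_lines list. The list is simply the newline-split form of the string; a quick sketch using the two lines from this run:

    # stderr_lines is the stderr string split on newlines.
    stderr = (
        "[003] #0, state:up persistent_state:present, 'veth0': add connection veth0, "
        "24b01189-d26a-4e67-9260-0cb7eb810428\n"
        "[004] #0, state:up persistent_state:present, 'veth0': up connection veth0, "
        "24b01189-d26a-4e67-9260-0cb7eb810428 (not-active)\n"
    )
    stderr_lines = stderr.splitlines()
    assert stderr_lines[0].startswith("[003]") and stderr_lines[1].endswith("(not-active)")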
7557 1726882091.04785: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 7557 1726882091.04942: in run() - task 12673a56-9f93-ed48-b3a5-000000000029 7557 1726882091.04946: variable 'ansible_search_path' from source: unknown 7557 1726882091.04949: variable 'ansible_search_path' from source: unknown 7557 1726882091.05050: calling self._execute() 7557 1726882091.05055: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882091.05061: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882091.05071: variable 'omit' from source: magic vars 7557 1726882091.05423: variable 'ansible_distribution_major_version' from source: facts 7557 1726882091.05436: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882091.05442: variable 'omit' from source: magic vars 7557 1726882091.05495: variable 'omit' from source: magic vars 7557 1726882091.05531: variable 'omit' from source: magic vars 7557 1726882091.05568: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7557 1726882091.05606: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7557 1726882091.05624: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7557 1726882091.05641: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7557 1726882091.05653: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7557 1726882091.05702: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7557 1726882091.05706: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882091.05708: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882091.05789: Set connection var ansible_module_compression to ZIP_DEFLATED 7557 1726882091.05810: Set connection var ansible_shell_executable to /bin/sh 7557 1726882091.05813: Set connection var ansible_shell_type to sh 7557 1726882091.05816: Set connection var ansible_pipelining to False 7557 1726882091.05818: Set connection var ansible_connection to ssh 7557 1726882091.05820: Set connection var ansible_timeout to 10 7557 1726882091.05899: variable 'ansible_shell_executable' from source: unknown 7557 1726882091.05902: variable 'ansible_connection' from source: unknown 7557 1726882091.05905: variable 'ansible_module_compression' from source: unknown 7557 1726882091.05907: variable 'ansible_shell_type' from source: unknown 7557 1726882091.05910: variable 'ansible_shell_executable' from source: unknown 7557 1726882091.05918: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882091.05920: variable 'ansible_pipelining' from source: unknown 7557 1726882091.05923: variable 'ansible_timeout' from source: unknown 7557 1726882091.05925: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882091.06028: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 7557 1726882091.06032: variable 'omit' from source: 
magic vars 7557 1726882091.06040: starting attempt loop 7557 1726882091.06042: running the handler 7557 1726882091.06059: variable '__network_connections_result' from source: set_fact 7557 1726882091.06139: variable '__network_connections_result' from source: set_fact 7557 1726882091.06272: handler run complete 7557 1726882091.06365: attempt loop complete, returning result 7557 1726882091.06368: _execute() done 7557 1726882091.06371: dumping result to json 7557 1726882091.06373: done dumping result, returning 7557 1726882091.06376: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [12673a56-9f93-ed48-b3a5-000000000029] 7557 1726882091.06378: sending task result for task 12673a56-9f93-ed48-b3a5-000000000029 7557 1726882091.06438: done sending task result for task 12673a56-9f93-ed48-b3a5-000000000029 7557 1726882091.06441: WORKER PROCESS EXITING ok: [managed_node3] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "ip": { "address": [ "2001:db8::2/64", "203.0.113.2/24" ], "auto6": false, "auto_gateway": true, "dhcp4": false, "gateway4": "203.0.113.1", "gateway6": "2001:db8::1", "route_metric4": 65535 }, "name": "veth0", "state": "up", "type": "ethernet" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "[003] #0, state:up persistent_state:present, 'veth0': add connection veth0, 24b01189-d26a-4e67-9260-0cb7eb810428\n[004] #0, state:up persistent_state:present, 'veth0': up connection veth0, 24b01189-d26a-4e67-9260-0cb7eb810428 (not-active)\n", "stderr_lines": [ "[003] #0, state:up persistent_state:present, 'veth0': add connection veth0, 24b01189-d26a-4e67-9260-0cb7eb810428", "[004] #0, state:up persistent_state:present, 'veth0': up connection veth0, 24b01189-d26a-4e67-9260-0cb7eb810428 (not-active)" ] } } 7557 1726882091.06554: no more pending results, returning what we have 7557 1726882091.06557: results queue empty 7557 1726882091.06558: checking for any_errors_fatal 7557 1726882091.06562: done checking for any_errors_fatal 7557 1726882091.06563: checking for max_fail_percentage 7557 1726882091.06564: done checking for max_fail_percentage 7557 1726882091.06565: checking to see if all hosts have failed and the running result is not ok 7557 1726882091.06566: done checking to see if all hosts have failed 7557 1726882091.06566: getting the remaining hosts for this loop 7557 1726882091.06568: done getting the remaining hosts for this loop 7557 1726882091.06571: getting the next task for host managed_node3 7557 1726882091.06576: done getting next task for host managed_node3 7557 1726882091.06579: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 7557 1726882091.06581: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7557 1726882091.06589: getting variables 7557 1726882091.06597: in VariableManager get_vars() 7557 1726882091.06638: Calling all_inventory to load vars for managed_node3 7557 1726882091.06644: Calling groups_inventory to load vars for managed_node3 7557 1726882091.06647: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882091.06655: Calling all_plugins_play to load vars for managed_node3 7557 1726882091.06658: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882091.06660: Calling groups_plugins_play to load vars for managed_node3 7557 1726882091.07977: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882091.09608: done with get_vars() 7557 1726882091.09633: done getting variables 7557 1726882091.09702: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 21:28:11 -0400 (0:00:00.056) 0:00:16.950 ****** 7557 1726882091.09737: entering _queue_task() for managed_node3/debug 7557 1726882091.10075: worker is 1 (out of 1 available) 7557 1726882091.10087: exiting _queue_task() for managed_node3/debug 7557 1726882091.10303: done queuing things up, now waiting for results queue to drain 7557 1726882091.10305: waiting for pending results... 7557 1726882091.10516: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 7557 1726882091.10541: in run() - task 12673a56-9f93-ed48-b3a5-00000000002a 7557 1726882091.10564: variable 'ansible_search_path' from source: unknown 7557 1726882091.10572: variable 'ansible_search_path' from source: unknown 7557 1726882091.10616: calling self._execute() 7557 1726882091.10723: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882091.10844: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882091.10848: variable 'omit' from source: magic vars 7557 1726882091.11158: variable 'ansible_distribution_major_version' from source: facts 7557 1726882091.11181: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882091.11306: variable 'network_state' from source: role '' defaults 7557 1726882091.11321: Evaluated conditional (network_state != {}): False 7557 1726882091.11327: when evaluation is False, skipping this task 7557 1726882091.11333: _execute() done 7557 1726882091.11338: dumping result to json 7557 1726882091.11344: done dumping result, returning 7557 1726882091.11384: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [12673a56-9f93-ed48-b3a5-00000000002a] 7557 1726882091.11387: sending task result for task 12673a56-9f93-ed48-b3a5-00000000002a skipping: [managed_node3] => { "false_condition": "network_state != {}" } 7557 1726882091.11537: no more pending results, returning what we have 7557 1726882091.11542: results queue empty 7557 1726882091.11542: checking for any_errors_fatal 7557 1726882091.11552: done checking for any_errors_fatal 7557 
1726882091.11553: checking for max_fail_percentage 7557 1726882091.11554: done checking for max_fail_percentage 7557 1726882091.11555: checking to see if all hosts have failed and the running result is not ok 7557 1726882091.11556: done checking to see if all hosts have failed 7557 1726882091.11556: getting the remaining hosts for this loop 7557 1726882091.11558: done getting the remaining hosts for this loop 7557 1726882091.11561: getting the next task for host managed_node3 7557 1726882091.11568: done getting next task for host managed_node3 7557 1726882091.11572: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 7557 1726882091.11576: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7557 1726882091.11591: getting variables 7557 1726882091.11592: in VariableManager get_vars() 7557 1726882091.11748: Calling all_inventory to load vars for managed_node3 7557 1726882091.11751: Calling groups_inventory to load vars for managed_node3 7557 1726882091.11753: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882091.11764: Calling all_plugins_play to load vars for managed_node3 7557 1726882091.11767: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882091.11769: Calling groups_plugins_play to load vars for managed_node3 7557 1726882091.12545: done sending task result for task 12673a56-9f93-ed48-b3a5-00000000002a 7557 1726882091.12549: WORKER PROCESS EXITING 7557 1726882091.13544: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882091.15307: done with get_vars() 7557 1726882091.15330: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 21:28:11 -0400 (0:00:00.056) 0:00:17.007 ****** 7557 1726882091.15429: entering _queue_task() for managed_node3/ping 7557 1726882091.15431: Creating lock for ping 7557 1726882091.15754: worker is 1 (out of 1 available) 7557 1726882091.15765: exiting _queue_task() for managed_node3/ping 7557 1726882091.15778: done queuing things up, now waiting for results queue to drain 7557 1726882091.15779: waiting for pending results... 
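
What follows is the complete life-cycle of one module run for the role's "Re-test connectivity" task (main.yml:192): evaluate the task's conditional, resolve connection variables, reuse the multiplexed SSH connection, stage and execute the module, then clean up. For reference, a hypothetical sketch of such a task; the ping action and the conditional are visible in the trace below, but the actual task body in main.yml is not reproduced in this log, so treat this as an approximation:

    # Hypothetical approximation of the task executed below; only the task name,
    # the ping action and the evaluated conditional come from the log.
    - name: Re-test connectivity
      ping:
      when: ansible_distribution_major_version != '6'

The "Creating lock for ping" entry above suggests this is the first use of the ping module in this run, so its AnsiballZ payload is built from scratch further down ("ANSIBALLZ: Creating module") rather than coming from the cache.
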
7557 1726882091.16067: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Re-test connectivity 7557 1726882091.16204: in run() - task 12673a56-9f93-ed48-b3a5-00000000002b 7557 1726882091.16224: variable 'ansible_search_path' from source: unknown 7557 1726882091.16231: variable 'ansible_search_path' from source: unknown 7557 1726882091.16275: calling self._execute() 7557 1726882091.16376: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882091.16388: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882091.16403: variable 'omit' from source: magic vars 7557 1726882091.16771: variable 'ansible_distribution_major_version' from source: facts 7557 1726882091.16787: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882091.16800: variable 'omit' from source: magic vars 7557 1726882091.16916: variable 'omit' from source: magic vars 7557 1726882091.16919: variable 'omit' from source: magic vars 7557 1726882091.16965: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7557 1726882091.17060: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7557 1726882091.17111: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7557 1726882091.17138: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7557 1726882091.17156: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7557 1726882091.17192: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7557 1726882091.17203: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882091.17211: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882091.17325: Set connection var ansible_module_compression to ZIP_DEFLATED 7557 1726882091.17337: Set connection var ansible_shell_executable to /bin/sh 7557 1726882091.17385: Set connection var ansible_shell_type to sh 7557 1726882091.17388: Set connection var ansible_pipelining to False 7557 1726882091.17391: Set connection var ansible_connection to ssh 7557 1726882091.17396: Set connection var ansible_timeout to 10 7557 1726882091.17404: variable 'ansible_shell_executable' from source: unknown 7557 1726882091.17413: variable 'ansible_connection' from source: unknown 7557 1726882091.17420: variable 'ansible_module_compression' from source: unknown 7557 1726882091.17427: variable 'ansible_shell_type' from source: unknown 7557 1726882091.17433: variable 'ansible_shell_executable' from source: unknown 7557 1726882091.17438: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882091.17445: variable 'ansible_pipelining' from source: unknown 7557 1726882091.17457: variable 'ansible_timeout' from source: unknown 7557 1726882091.17511: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882091.17679: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 7557 1726882091.17699: variable 'omit' from source: magic vars 7557 1726882091.17710: starting attempt loop 7557 
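
The connection variables resolved above (ssh connection plugin, sh shell, pipelining off, 10-second timeout, ZIP_DEFLATED module compression) decide how the module reaches the host in the next entries. They are ordinary Ansible variables; a purely illustrative host_vars-style sketch, with standard variable names and values mirroring what this run resolved rather than any inventory file shown in the log:

    # Illustrative only: standard Ansible connection variables set to the values
    # the executor resolved above; not copied from an actual inventory/host_vars file.
    ansible_connection: ssh
    ansible_shell_type: sh
    ansible_shell_executable: /bin/sh
    ansible_pipelining: false   # off, so the module is copied to a remote temp dir and run from there
    ansible_timeout: 10

Because pipelining is off, the trace below takes the file-based path: mkdir an ansible-tmp-... directory under the remote user's ~/.ansible/tmp, sftp the AnsiballZ_ping.py payload, chmod u+x, execute it with /usr/bin/python3.12, and finally remove the directory.
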
1726882091.17718: running the handler 7557 1726882091.17742: _low_level_execute_command(): starting 7557 1726882091.17753: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7557 1726882091.18586: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882091.18605: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882091.18735: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882091.18949: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882091.20572: stdout chunk (state=3): >>>/root <<< 7557 1726882091.20848: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882091.20852: stdout chunk (state=3): >>><<< 7557 1726882091.20855: stderr chunk (state=3): >>><<< 7557 1726882091.21086: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7557 1726882091.21090: _low_level_execute_command(): starting 7557 1726882091.21095: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882091.2088544-8269-50999460301612 `" && echo ansible-tmp-1726882091.2088544-8269-50999460301612="` echo /root/.ansible/tmp/ansible-tmp-1726882091.2088544-8269-50999460301612 `" ) && sleep 0' 7557 1726882091.22228: stderr chunk (state=2): 
>>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7557 1726882091.22346: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882091.22468: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882091.22504: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882091.22586: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882091.22632: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882091.24726: stdout chunk (state=3): >>>ansible-tmp-1726882091.2088544-8269-50999460301612=/root/.ansible/tmp/ansible-tmp-1726882091.2088544-8269-50999460301612 <<< 7557 1726882091.25099: stdout chunk (state=3): >>><<< 7557 1726882091.25104: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882091.25106: stderr chunk (state=3): >>><<< 7557 1726882091.25108: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882091.2088544-8269-50999460301612=/root/.ansible/tmp/ansible-tmp-1726882091.2088544-8269-50999460301612 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7557 1726882091.25111: variable 'ansible_module_compression' from source: unknown 7557 1726882091.25112: ANSIBALLZ: Using lock for ping 7557 1726882091.25114: ANSIBALLZ: Acquiring lock 7557 1726882091.25116: ANSIBALLZ: Lock acquired: 140194281477456 7557 1726882091.25118: ANSIBALLZ: Creating module 7557 1726882091.39408: ANSIBALLZ: Writing module into payload 7557 1726882091.39475: ANSIBALLZ: 
Writing module 7557 1726882091.39622: ANSIBALLZ: Renaming module 7557 1726882091.39636: ANSIBALLZ: Done creating module 7557 1726882091.39657: variable 'ansible_facts' from source: unknown 7557 1726882091.40021: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882091.2088544-8269-50999460301612/AnsiballZ_ping.py 7557 1726882091.40215: Sending initial data 7557 1726882091.40224: Sent initial data (150 bytes) 7557 1726882091.41313: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7557 1726882091.41599: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882091.41617: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882091.41696: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882091.43227: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 7557 1726882091.43241: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 7557 1726882091.43271: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 7557 1726882091.43539: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-7557ap94rh2e/tmpkqn3d1ia /root/.ansible/tmp/ansible-tmp-1726882091.2088544-8269-50999460301612/AnsiballZ_ping.py <<< 7557 1726882091.43543: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882091.2088544-8269-50999460301612/AnsiballZ_ping.py" debug1: stat remote: No such file or directory <<< 7557 1726882091.43546: stderr chunk (state=3): >>>debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-7557ap94rh2e/tmpkqn3d1ia" to remote "/root/.ansible/tmp/ansible-tmp-1726882091.2088544-8269-50999460301612/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882091.2088544-8269-50999460301612/AnsiballZ_ping.py" <<< 7557 1726882091.44897: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882091.44913: stderr chunk (state=3): >>><<< 7557 1726882091.44946: stdout chunk (state=3): >>><<< 7557 1726882091.44973: done transferring module to remote 7557 1726882091.45181: _low_level_execute_command(): starting 7557 1726882091.45185: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882091.2088544-8269-50999460301612/ /root/.ansible/tmp/ansible-tmp-1726882091.2088544-8269-50999460301612/AnsiballZ_ping.py && sleep 0' 7557 1726882091.46284: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7557 1726882091.46301: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882091.46314: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7557 1726882091.46453: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found <<< 7557 1726882091.46517: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882091.46532: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882091.46561: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882091.46650: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882091.48461: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882091.48465: stderr chunk (state=3): >>><<< 7557 1726882091.48468: stdout chunk (state=3): >>><<< 7557 1726882091.48470: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7557 1726882091.48473: _low_level_execute_command(): starting 7557 1726882091.48475: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882091.2088544-8269-50999460301612/AnsiballZ_ping.py && sleep 0' 7557 1726882091.49526: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7557 1726882091.49541: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7557 1726882091.49555: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882091.49613: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882091.49670: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882091.49689: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882091.49712: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882091.49788: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882091.64268: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 7557 1726882091.65600: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. 
<<< 7557 1726882091.65604: stdout chunk (state=3): >>><<< 7557 1726882091.65606: stderr chunk (state=3): >>><<< 7557 1726882091.65609: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. 7557 1726882091.65612: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882091.2088544-8269-50999460301612/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7557 1726882091.65614: _low_level_execute_command(): starting 7557 1726882091.65616: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882091.2088544-8269-50999460301612/ > /dev/null 2>&1 && sleep 0' 7557 1726882091.66150: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882091.66214: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 
1726882091.66226: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882091.66239: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882091.66307: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882091.68100: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882091.68104: stdout chunk (state=3): >>><<< 7557 1726882091.68112: stderr chunk (state=3): >>><<< 7557 1726882091.68142: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7557 1726882091.68149: handler run complete 7557 1726882091.68165: attempt loop complete, returning result 7557 1726882091.68169: _execute() done 7557 1726882091.68171: dumping result to json 7557 1726882091.68174: done dumping result, returning 7557 1726882091.68183: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Re-test connectivity [12673a56-9f93-ed48-b3a5-00000000002b] 7557 1726882091.68188: sending task result for task 12673a56-9f93-ed48-b3a5-00000000002b 7557 1726882091.68290: done sending task result for task 12673a56-9f93-ed48-b3a5-00000000002b 7557 1726882091.68295: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "ping": "pong" } 7557 1726882091.68367: no more pending results, returning what we have 7557 1726882091.68372: results queue empty 7557 1726882091.68373: checking for any_errors_fatal 7557 1726882091.68379: done checking for any_errors_fatal 7557 1726882091.68380: checking for max_fail_percentage 7557 1726882091.68382: done checking for max_fail_percentage 7557 1726882091.68383: checking to see if all hosts have failed and the running result is not ok 7557 1726882091.68383: done checking to see if all hosts have failed 7557 1726882091.68384: getting the remaining hosts for this loop 7557 1726882091.68386: done getting the remaining hosts for this loop 7557 1726882091.68390: getting the next task for host managed_node3 7557 1726882091.68404: done getting next task for host managed_node3 7557 1726882091.68406: ^ task is: TASK: meta (role_complete) 7557 1726882091.68410: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7557 1726882091.68423: getting variables 7557 1726882091.68425: in VariableManager get_vars() 7557 1726882091.68483: Calling all_inventory to load vars for managed_node3 7557 1726882091.68486: Calling groups_inventory to load vars for managed_node3 7557 1726882091.68489: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882091.68713: Calling all_plugins_play to load vars for managed_node3 7557 1726882091.68725: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882091.68729: Calling groups_plugins_play to load vars for managed_node3 7557 1726882091.70270: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882091.72168: done with get_vars() 7557 1726882091.72206: done getting variables 7557 1726882091.72299: done queuing things up, now waiting for results queue to drain 7557 1726882091.72302: results queue empty 7557 1726882091.72303: checking for any_errors_fatal 7557 1726882091.72312: done checking for any_errors_fatal 7557 1726882091.72313: checking for max_fail_percentage 7557 1726882091.72314: done checking for max_fail_percentage 7557 1726882091.72315: checking to see if all hosts have failed and the running result is not ok 7557 1726882091.72316: done checking to see if all hosts have failed 7557 1726882091.72317: getting the remaining hosts for this loop 7557 1726882091.72318: done getting the remaining hosts for this loop 7557 1726882091.72321: getting the next task for host managed_node3 7557 1726882091.72325: done getting next task for host managed_node3 7557 1726882091.72328: ^ task is: TASK: Include the task 'assert_device_present.yml' 7557 1726882091.72329: ^ state is: HOST STATE: block=2, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7557 1726882091.72332: getting variables 7557 1726882091.72333: in VariableManager get_vars() 7557 1726882091.72354: Calling all_inventory to load vars for managed_node3 7557 1726882091.72356: Calling groups_inventory to load vars for managed_node3 7557 1726882091.72358: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882091.72364: Calling all_plugins_play to load vars for managed_node3 7557 1726882091.72366: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882091.72369: Calling groups_plugins_play to load vars for managed_node3 7557 1726882091.73468: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882091.75108: done with get_vars() 7557 1726882091.75155: done getting variables TASK [Include the task 'assert_device_present.yml'] **************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_auto_gateway.yml:42 Friday 20 September 2024 21:28:11 -0400 (0:00:00.598) 0:00:17.605 ****** 7557 1726882091.75246: entering _queue_task() for managed_node3/include_tasks 7557 1726882091.75672: worker is 1 (out of 1 available) 7557 1726882091.75684: exiting _queue_task() for managed_node3/include_tasks 7557 1726882091.75707: done queuing things up, now waiting for results queue to drain 7557 1726882091.75709: waiting for pending results... 7557 1726882091.75959: running TaskExecutor() for managed_node3/TASK: Include the task 'assert_device_present.yml' 7557 1726882091.76099: in run() - task 12673a56-9f93-ed48-b3a5-00000000005b 7557 1726882091.76103: variable 'ansible_search_path' from source: unknown 7557 1726882091.76123: calling self._execute() 7557 1726882091.76219: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882091.76233: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882091.76248: variable 'omit' from source: magic vars 7557 1726882091.76701: variable 'ansible_distribution_major_version' from source: facts 7557 1726882091.76704: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882091.76707: _execute() done 7557 1726882091.76709: dumping result to json 7557 1726882091.76712: done dumping result, returning 7557 1726882091.76714: done running TaskExecutor() for managed_node3/TASK: Include the task 'assert_device_present.yml' [12673a56-9f93-ed48-b3a5-00000000005b] 7557 1726882091.76716: sending task result for task 12673a56-9f93-ed48-b3a5-00000000005b 7557 1726882091.76784: done sending task result for task 12673a56-9f93-ed48-b3a5-00000000005b 7557 1726882091.76788: WORKER PROCESS EXITING 7557 1726882091.76818: no more pending results, returning what we have 7557 1726882091.76824: in VariableManager get_vars() 7557 1726882091.76878: Calling all_inventory to load vars for managed_node3 7557 1726882091.76881: Calling groups_inventory to load vars for managed_node3 7557 1726882091.76883: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882091.76901: Calling all_plugins_play to load vars for managed_node3 7557 1726882091.76904: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882091.76907: Calling groups_plugins_play to load vars for managed_node3 7557 1726882091.78333: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882091.79745: done with get_vars() 7557 1726882091.79764: variable 'ansible_search_path' 
from source: unknown 7557 1726882091.79778: we have included files to process 7557 1726882091.79779: generating all_blocks data 7557 1726882091.79781: done generating all_blocks data 7557 1726882091.79786: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 7557 1726882091.79787: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 7557 1726882091.79790: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 7557 1726882091.79903: in VariableManager get_vars() 7557 1726882091.79934: done with get_vars() 7557 1726882091.80046: done processing included file 7557 1726882091.80048: iterating over new_blocks loaded from include file 7557 1726882091.80050: in VariableManager get_vars() 7557 1726882091.80071: done with get_vars() 7557 1726882091.80073: filtering new block on tags 7557 1726882091.80090: done filtering new block on tags 7557 1726882091.80092: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml for managed_node3 7557 1726882091.80100: extending task lists for all hosts with included blocks 7557 1726882091.84280: done extending task lists 7557 1726882091.84282: done processing included files 7557 1726882091.84283: results queue empty 7557 1726882091.84284: checking for any_errors_fatal 7557 1726882091.84286: done checking for any_errors_fatal 7557 1726882091.84287: checking for max_fail_percentage 7557 1726882091.84288: done checking for max_fail_percentage 7557 1726882091.84288: checking to see if all hosts have failed and the running result is not ok 7557 1726882091.84289: done checking to see if all hosts have failed 7557 1726882091.84290: getting the remaining hosts for this loop 7557 1726882091.84292: done getting the remaining hosts for this loop 7557 1726882091.84296: getting the next task for host managed_node3 7557 1726882091.84300: done getting next task for host managed_node3 7557 1726882091.84302: ^ task is: TASK: Include the task 'get_interface_stat.yml' 7557 1726882091.84305: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7557 1726882091.84308: getting variables 7557 1726882091.84309: in VariableManager get_vars() 7557 1726882091.84331: Calling all_inventory to load vars for managed_node3 7557 1726882091.84333: Calling groups_inventory to load vars for managed_node3 7557 1726882091.84335: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882091.84342: Calling all_plugins_play to load vars for managed_node3 7557 1726882091.84344: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882091.84347: Calling groups_plugins_play to load vars for managed_node3 7557 1726882091.85501: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882091.91416: done with get_vars() 7557 1726882091.91439: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3 Friday 20 September 2024 21:28:11 -0400 (0:00:00.162) 0:00:17.768 ****** 7557 1726882091.91513: entering _queue_task() for managed_node3/include_tasks 7557 1726882091.91842: worker is 1 (out of 1 available) 7557 1726882091.91854: exiting _queue_task() for managed_node3/include_tasks 7557 1726882091.91866: done queuing things up, now waiting for results queue to drain 7557 1726882091.91868: waiting for pending results... 7557 1726882091.92314: running TaskExecutor() for managed_node3/TASK: Include the task 'get_interface_stat.yml' 7557 1726882091.92320: in run() - task 12673a56-9f93-ed48-b3a5-0000000008c2 7557 1726882091.92323: variable 'ansible_search_path' from source: unknown 7557 1726882091.92326: variable 'ansible_search_path' from source: unknown 7557 1726882091.92333: calling self._execute() 7557 1726882091.92433: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882091.92453: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882091.92467: variable 'omit' from source: magic vars 7557 1726882091.92860: variable 'ansible_distribution_major_version' from source: facts 7557 1726882091.92883: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882091.92897: _execute() done 7557 1726882091.92906: dumping result to json 7557 1726882091.92914: done dumping result, returning 7557 1726882091.92925: done running TaskExecutor() for managed_node3/TASK: Include the task 'get_interface_stat.yml' [12673a56-9f93-ed48-b3a5-0000000008c2] 7557 1726882091.92936: sending task result for task 12673a56-9f93-ed48-b3a5-0000000008c2 7557 1726882091.93247: done sending task result for task 12673a56-9f93-ed48-b3a5-0000000008c2 7557 1726882091.93250: WORKER PROCESS EXITING 7557 1726882091.93275: no more pending results, returning what we have 7557 1726882091.93280: in VariableManager get_vars() 7557 1726882091.93334: Calling all_inventory to load vars for managed_node3 7557 1726882091.93337: Calling groups_inventory to load vars for managed_node3 7557 1726882091.93339: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882091.93351: Calling all_plugins_play to load vars for managed_node3 7557 1726882091.93354: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882091.93357: Calling groups_plugins_play to load vars for managed_node3 7557 1726882091.94688: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 
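
Two nested include files have now been loaded: tests_auto_gateway.yml pulls in assert_device_present.yml, which pulls in get_interface_stat.yml; the entries that follow queue that file's templated task, "Get stat for interface {{ interface }}", with interface resolving to veth0 from play vars. A hypothetical sketch of the chain; only the file names, task names and the veth0 value come from the log, the task bodies are assumptions:

    # tasks/assert_device_present.yml  (hypothetical sketch)
    ---
    - name: Include the task 'get_interface_stat.yml'
      include_tasks: get_interface_stat.yml   # exact relative path is an assumption

    # tasks/get_interface_stat.yml  (hypothetical sketch)
    ---
    - name: Get stat for interface {{ interface }}
      stat:
        path: "/sys/class/net/{{ interface }}"  # path is an assumption; the log only names the stat module
      register: interface_stat                  # register name is an assumption

The stat run then repeats the same delivery sequence as the ping above, except that AnsiballZ reuses a cached ansible.modules.stat payload ("ANSIBALLZ: using cached module" further down) instead of rebuilding it.
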
1726882091.96190: done with get_vars() 7557 1726882091.96215: variable 'ansible_search_path' from source: unknown 7557 1726882091.96217: variable 'ansible_search_path' from source: unknown 7557 1726882091.96257: we have included files to process 7557 1726882091.96258: generating all_blocks data 7557 1726882091.96260: done generating all_blocks data 7557 1726882091.96262: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 7557 1726882091.96263: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 7557 1726882091.96265: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 7557 1726882091.96461: done processing included file 7557 1726882091.96463: iterating over new_blocks loaded from include file 7557 1726882091.96465: in VariableManager get_vars() 7557 1726882091.96494: done with get_vars() 7557 1726882091.96497: filtering new block on tags 7557 1726882091.96513: done filtering new block on tags 7557 1726882091.96515: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed_node3 7557 1726882091.96521: extending task lists for all hosts with included blocks 7557 1726882091.96625: done extending task lists 7557 1726882091.96627: done processing included files 7557 1726882091.96628: results queue empty 7557 1726882091.96628: checking for any_errors_fatal 7557 1726882091.96631: done checking for any_errors_fatal 7557 1726882091.96632: checking for max_fail_percentage 7557 1726882091.96633: done checking for max_fail_percentage 7557 1726882091.96634: checking to see if all hosts have failed and the running result is not ok 7557 1726882091.96635: done checking to see if all hosts have failed 7557 1726882091.96635: getting the remaining hosts for this loop 7557 1726882091.96636: done getting the remaining hosts for this loop 7557 1726882091.96639: getting the next task for host managed_node3 7557 1726882091.96642: done getting next task for host managed_node3 7557 1726882091.96644: ^ task is: TASK: Get stat for interface {{ interface }} 7557 1726882091.96647: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7557 1726882091.96649: getting variables 7557 1726882091.96650: in VariableManager get_vars() 7557 1726882091.96668: Calling all_inventory to load vars for managed_node3 7557 1726882091.96670: Calling groups_inventory to load vars for managed_node3 7557 1726882091.96672: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882091.96678: Calling all_plugins_play to load vars for managed_node3 7557 1726882091.96681: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882091.96684: Calling groups_plugins_play to load vars for managed_node3 7557 1726882091.97852: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882091.99309: done with get_vars() 7557 1726882091.99330: done getting variables 7557 1726882091.99491: variable 'interface' from source: play vars TASK [Get stat for interface veth0] ******************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Friday 20 September 2024 21:28:11 -0400 (0:00:00.080) 0:00:17.848 ****** 7557 1726882091.99525: entering _queue_task() for managed_node3/stat 7557 1726882091.99854: worker is 1 (out of 1 available) 7557 1726882091.99865: exiting _queue_task() for managed_node3/stat 7557 1726882091.99877: done queuing things up, now waiting for results queue to drain 7557 1726882091.99879: waiting for pending results... 7557 1726882092.00156: running TaskExecutor() for managed_node3/TASK: Get stat for interface veth0 7557 1726882092.00283: in run() - task 12673a56-9f93-ed48-b3a5-000000000ac6 7557 1726882092.00309: variable 'ansible_search_path' from source: unknown 7557 1726882092.00322: variable 'ansible_search_path' from source: unknown 7557 1726882092.00361: calling self._execute() 7557 1726882092.00465: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882092.00478: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882092.00497: variable 'omit' from source: magic vars 7557 1726882092.00887: variable 'ansible_distribution_major_version' from source: facts 7557 1726882092.01001: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882092.01005: variable 'omit' from source: magic vars 7557 1726882092.01008: variable 'omit' from source: magic vars 7557 1726882092.01057: variable 'interface' from source: play vars 7557 1726882092.01077: variable 'omit' from source: magic vars 7557 1726882092.01123: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7557 1726882092.01159: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7557 1726882092.01183: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7557 1726882092.01199: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7557 1726882092.01214: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7557 1726882092.01245: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7557 1726882092.01249: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882092.01252: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882092.01360: Set 
connection var ansible_module_compression to ZIP_DEFLATED 7557 1726882092.01404: Set connection var ansible_shell_executable to /bin/sh 7557 1726882092.01408: Set connection var ansible_shell_type to sh 7557 1726882092.01410: Set connection var ansible_pipelining to False 7557 1726882092.01413: Set connection var ansible_connection to ssh 7557 1726882092.01416: Set connection var ansible_timeout to 10 7557 1726882092.01418: variable 'ansible_shell_executable' from source: unknown 7557 1726882092.01425: variable 'ansible_connection' from source: unknown 7557 1726882092.01428: variable 'ansible_module_compression' from source: unknown 7557 1726882092.01430: variable 'ansible_shell_type' from source: unknown 7557 1726882092.01432: variable 'ansible_shell_executable' from source: unknown 7557 1726882092.01435: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882092.01437: variable 'ansible_pipelining' from source: unknown 7557 1726882092.01439: variable 'ansible_timeout' from source: unknown 7557 1726882092.01441: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882092.01700: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 7557 1726882092.01705: variable 'omit' from source: magic vars 7557 1726882092.01707: starting attempt loop 7557 1726882092.01710: running the handler 7557 1726882092.01712: _low_level_execute_command(): starting 7557 1726882092.01714: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7557 1726882092.02418: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found <<< 7557 1726882092.02509: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882092.02525: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882092.02535: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882092.02639: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882092.04309: stdout chunk (state=3): >>>/root <<< 7557 1726882092.04413: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882092.04435: stderr chunk (state=3): >>><<< 7557 1726882092.04439: stdout chunk (state=3): >>><<< 7557 1726882092.04458: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7557 1726882092.04470: _low_level_execute_command(): starting 7557 1726882092.04477: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882092.044583-8301-201140043401256 `" && echo ansible-tmp-1726882092.044583-8301-201140043401256="` echo /root/.ansible/tmp/ansible-tmp-1726882092.044583-8301-201140043401256 `" ) && sleep 0' 7557 1726882092.04875: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7557 1726882092.04909: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7557 1726882092.04912: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found <<< 7557 1726882092.04915: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882092.04924: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882092.04929: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882092.04967: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882092.04971: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882092.05022: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882092.06879: stdout chunk (state=3): >>>ansible-tmp-1726882092.044583-8301-201140043401256=/root/.ansible/tmp/ansible-tmp-1726882092.044583-8301-201140043401256 <<< 7557 1726882092.06980: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882092.07011: stderr chunk (state=3): >>><<< 7557 1726882092.07014: stdout chunk (state=3): >>><<< 7557 1726882092.07029: 
_low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882092.044583-8301-201140043401256=/root/.ansible/tmp/ansible-tmp-1726882092.044583-8301-201140043401256 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7557 1726882092.07099: variable 'ansible_module_compression' from source: unknown 7557 1726882092.07114: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-7557ap94rh2e/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 7557 1726882092.07149: variable 'ansible_facts' from source: unknown 7557 1726882092.07217: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882092.044583-8301-201140043401256/AnsiballZ_stat.py 7557 1726882092.07316: Sending initial data 7557 1726882092.07319: Sent initial data (150 bytes) 7557 1726882092.07751: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882092.07756: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found <<< 7557 1726882092.07758: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 7557 1726882092.07762: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 <<< 7557 1726882092.07764: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882092.07816: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882092.07820: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882092.07825: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882092.07870: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882092.09365: stderr chunk (state=3): 
>>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 7557 1726882092.09411: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 7557 1726882092.09447: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-7557ap94rh2e/tmpcaz6fntj /root/.ansible/tmp/ansible-tmp-1726882092.044583-8301-201140043401256/AnsiballZ_stat.py <<< 7557 1726882092.09454: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882092.044583-8301-201140043401256/AnsiballZ_stat.py" <<< 7557 1726882092.09490: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-7557ap94rh2e/tmpcaz6fntj" to remote "/root/.ansible/tmp/ansible-tmp-1726882092.044583-8301-201140043401256/AnsiballZ_stat.py" <<< 7557 1726882092.09504: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882092.044583-8301-201140043401256/AnsiballZ_stat.py" <<< 7557 1726882092.10049: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882092.10091: stderr chunk (state=3): >>><<< 7557 1726882092.10096: stdout chunk (state=3): >>><<< 7557 1726882092.10116: done transferring module to remote 7557 1726882092.10124: _low_level_execute_command(): starting 7557 1726882092.10132: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882092.044583-8301-201140043401256/ /root/.ansible/tmp/ansible-tmp-1726882092.044583-8301-201140043401256/AnsiballZ_stat.py && sleep 0' 7557 1726882092.10559: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7557 1726882092.10562: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found <<< 7557 1726882092.10564: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882092.10566: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882092.10572: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882092.10623: stderr chunk (state=3): >>>debug1: auto-mux: Trying 
existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882092.10630: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882092.10672: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882092.12354: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882092.12379: stderr chunk (state=3): >>><<< 7557 1726882092.12382: stdout chunk (state=3): >>><<< 7557 1726882092.12400: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7557 1726882092.12403: _low_level_execute_command(): starting 7557 1726882092.12409: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882092.044583-8301-201140043401256/AnsiballZ_stat.py && sleep 0' 7557 1726882092.12829: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882092.12832: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882092.12836: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882092.12881: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882092.12884: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882092.12951: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882092.27751: stdout chunk (state=3): >>> {"changed": false, "stat": 
{"exists": true, "path": "/sys/class/net/veth0", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 25123, "dev": 23, "nlink": 1, "atime": 1726882083.1308763, "mtime": 1726882083.1308763, "ctime": 1726882083.1308763, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/veth0", "lnk_target": "../../devices/virtual/net/veth0", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/veth0", "follow": false, "checksum_algorithm": "sha1"}}} <<< 7557 1726882092.29007: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. <<< 7557 1726882092.29012: stdout chunk (state=3): >>><<< 7557 1726882092.29014: stderr chunk (state=3): >>><<< 7557 1726882092.29036: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/veth0", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 25123, "dev": 23, "nlink": 1, "atime": 1726882083.1308763, "mtime": 1726882083.1308763, "ctime": 1726882083.1308763, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/veth0", "lnk_target": "../../devices/virtual/net/veth0", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/veth0", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. 
7557 1726882092.29176: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/veth0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882092.044583-8301-201140043401256/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7557 1726882092.29185: _low_level_execute_command(): starting 7557 1726882092.29189: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882092.044583-8301-201140043401256/ > /dev/null 2>&1 && sleep 0' 7557 1726882092.29723: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7557 1726882092.29738: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7557 1726882092.29752: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882092.29771: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7557 1726882092.29789: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 <<< 7557 1726882092.29805: stderr chunk (state=3): >>>debug2: match not found <<< 7557 1726882092.29820: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882092.29839: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7557 1726882092.29852: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.10.229 is address <<< 7557 1726882092.29913: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882092.29952: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882092.29970: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882092.29998: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882092.30076: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882092.31982: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882092.31992: stdout chunk (state=3): >>><<< 7557 1726882092.32009: stderr chunk (state=3): >>><<< 7557 1726882092.32031: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match 
not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7557 1726882092.32043: handler run complete 7557 1726882092.32091: attempt loop complete, returning result 7557 1726882092.32101: _execute() done 7557 1726882092.32107: dumping result to json 7557 1726882092.32116: done dumping result, returning 7557 1726882092.32127: done running TaskExecutor() for managed_node3/TASK: Get stat for interface veth0 [12673a56-9f93-ed48-b3a5-000000000ac6] 7557 1726882092.32135: sending task result for task 12673a56-9f93-ed48-b3a5-000000000ac6 ok: [managed_node3] => { "changed": false, "stat": { "atime": 1726882083.1308763, "block_size": 4096, "blocks": 0, "ctime": 1726882083.1308763, "dev": 23, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 25123, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": true, "isreg": false, "issock": false, "isuid": false, "lnk_source": "/sys/devices/virtual/net/veth0", "lnk_target": "../../devices/virtual/net/veth0", "mode": "0777", "mtime": 1726882083.1308763, "nlink": 1, "path": "/sys/class/net/veth0", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "wgrp": true, "woth": true, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } 7557 1726882092.32349: no more pending results, returning what we have 7557 1726882092.32352: results queue empty 7557 1726882092.32353: checking for any_errors_fatal 7557 1726882092.32355: done checking for any_errors_fatal 7557 1726882092.32355: checking for max_fail_percentage 7557 1726882092.32357: done checking for max_fail_percentage 7557 1726882092.32358: checking to see if all hosts have failed and the running result is not ok 7557 1726882092.32359: done checking to see if all hosts have failed 7557 1726882092.32359: getting the remaining hosts for this loop 7557 1726882092.32361: done getting the remaining hosts for this loop 7557 1726882092.32364: getting the next task for host managed_node3 7557 1726882092.32371: done getting next task for host managed_node3 7557 1726882092.32373: ^ task is: TASK: Assert that the interface is present - '{{ interface }}' 7557 1726882092.32376: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False 7557 1726882092.32381: getting variables 7557 1726882092.32382: in VariableManager get_vars() 7557 1726882092.32480: Calling all_inventory to load vars for managed_node3 7557 1726882092.32483: Calling groups_inventory to load vars for managed_node3 7557 1726882092.32485: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882092.32552: Calling all_plugins_play to load vars for managed_node3 7557 1726882092.32556: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882092.32560: Calling groups_plugins_play to load vars for managed_node3 7557 1726882092.33080: done sending task result for task 12673a56-9f93-ed48-b3a5-000000000ac6 7557 1726882092.33083: WORKER PROCESS EXITING 7557 1726882092.33864: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882092.35455: done with get_vars() 7557 1726882092.35484: done getting variables 7557 1726882092.35547: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 7557 1726882092.35665: variable 'interface' from source: play vars TASK [Assert that the interface is present - 'veth0'] ************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5 Friday 20 September 2024 21:28:12 -0400 (0:00:00.361) 0:00:18.209 ****** 7557 1726882092.35700: entering _queue_task() for managed_node3/assert 7557 1726882092.36048: worker is 1 (out of 1 available) 7557 1726882092.36060: exiting _queue_task() for managed_node3/assert 7557 1726882092.36073: done queuing things up, now waiting for results queue to drain 7557 1726882092.36074: waiting for pending results... 
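
Note: the assert task that runs next never contacts the managed host; it evaluates the registered fact interface_stat.stat.exists as a bare Jinja2 expression against task vars on the controller. A rough illustration of that kind of conditional evaluation, using the jinja2 library directly rather than Ansible's internal code path (the fact shape is copied from the stat result logged above):

    from jinja2 import Environment

    # Registered result from the earlier "Get stat for interface veth0" task.
    task_vars = {"interface_stat": {"stat": {"exists": True, "islnk": True}}}

    # Ansible conditionals are bare Jinja2 expressions evaluated against task vars.
    expr = Environment().compile_expression("interface_stat.stat.exists")
    assert expr(**task_vars) is True, "interface veth0 should be present"
    print("All assertions passed")
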
7557 1726882092.36399: running TaskExecutor() for managed_node3/TASK: Assert that the interface is present - 'veth0' 7557 1726882092.36529: in run() - task 12673a56-9f93-ed48-b3a5-0000000008c3 7557 1726882092.36543: variable 'ansible_search_path' from source: unknown 7557 1726882092.36546: variable 'ansible_search_path' from source: unknown 7557 1726882092.36592: calling self._execute() 7557 1726882092.36707: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882092.36713: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882092.36722: variable 'omit' from source: magic vars 7557 1726882092.37152: variable 'ansible_distribution_major_version' from source: facts 7557 1726882092.37168: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882092.37171: variable 'omit' from source: magic vars 7557 1726882092.37223: variable 'omit' from source: magic vars 7557 1726882092.37326: variable 'interface' from source: play vars 7557 1726882092.37386: variable 'omit' from source: magic vars 7557 1726882092.37390: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7557 1726882092.37430: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7557 1726882092.37457: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7557 1726882092.37475: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7557 1726882092.37601: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7557 1726882092.37604: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7557 1726882092.37608: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882092.37610: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882092.37647: Set connection var ansible_module_compression to ZIP_DEFLATED 7557 1726882092.37654: Set connection var ansible_shell_executable to /bin/sh 7557 1726882092.37657: Set connection var ansible_shell_type to sh 7557 1726882092.37671: Set connection var ansible_pipelining to False 7557 1726882092.37674: Set connection var ansible_connection to ssh 7557 1726882092.37679: Set connection var ansible_timeout to 10 7557 1726882092.37708: variable 'ansible_shell_executable' from source: unknown 7557 1726882092.37711: variable 'ansible_connection' from source: unknown 7557 1726882092.37714: variable 'ansible_module_compression' from source: unknown 7557 1726882092.37717: variable 'ansible_shell_type' from source: unknown 7557 1726882092.37719: variable 'ansible_shell_executable' from source: unknown 7557 1726882092.37722: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882092.37724: variable 'ansible_pipelining' from source: unknown 7557 1726882092.37726: variable 'ansible_timeout' from source: unknown 7557 1726882092.37728: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882092.37875: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 7557 1726882092.37896: 
variable 'omit' from source: magic vars 7557 1726882092.37906: starting attempt loop 7557 1726882092.37909: running the handler 7557 1726882092.38060: variable 'interface_stat' from source: set_fact 7557 1726882092.38078: Evaluated conditional (interface_stat.stat.exists): True 7557 1726882092.38084: handler run complete 7557 1726882092.38110: attempt loop complete, returning result 7557 1726882092.38113: _execute() done 7557 1726882092.38116: dumping result to json 7557 1726882092.38118: done dumping result, returning 7557 1726882092.38125: done running TaskExecutor() for managed_node3/TASK: Assert that the interface is present - 'veth0' [12673a56-9f93-ed48-b3a5-0000000008c3] 7557 1726882092.38130: sending task result for task 12673a56-9f93-ed48-b3a5-0000000008c3 7557 1726882092.38222: done sending task result for task 12673a56-9f93-ed48-b3a5-0000000008c3 7557 1726882092.38224: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false } MSG: All assertions passed 7557 1726882092.38291: no more pending results, returning what we have 7557 1726882092.38297: results queue empty 7557 1726882092.38298: checking for any_errors_fatal 7557 1726882092.38308: done checking for any_errors_fatal 7557 1726882092.38309: checking for max_fail_percentage 7557 1726882092.38310: done checking for max_fail_percentage 7557 1726882092.38311: checking to see if all hosts have failed and the running result is not ok 7557 1726882092.38312: done checking to see if all hosts have failed 7557 1726882092.38313: getting the remaining hosts for this loop 7557 1726882092.38314: done getting the remaining hosts for this loop 7557 1726882092.38318: getting the next task for host managed_node3 7557 1726882092.38325: done getting next task for host managed_node3 7557 1726882092.38328: ^ task is: TASK: Include the task 'assert_profile_present.yml' 7557 1726882092.38330: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7557 1726882092.38335: getting variables 7557 1726882092.38336: in VariableManager get_vars() 7557 1726882092.38386: Calling all_inventory to load vars for managed_node3 7557 1726882092.38389: Calling groups_inventory to load vars for managed_node3 7557 1726882092.38391: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882092.38506: Calling all_plugins_play to load vars for managed_node3 7557 1726882092.38510: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882092.38513: Calling groups_plugins_play to load vars for managed_node3 7557 1726882092.39919: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882092.41352: done with get_vars() 7557 1726882092.41377: done getting variables TASK [Include the task 'assert_profile_present.yml'] *************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_auto_gateway.yml:44 Friday 20 September 2024 21:28:12 -0400 (0:00:00.057) 0:00:18.267 ****** 7557 1726882092.41474: entering _queue_task() for managed_node3/include_tasks 7557 1726882092.41812: worker is 1 (out of 1 available) 7557 1726882092.41824: exiting _queue_task() for managed_node3/include_tasks 7557 1726882092.41836: done queuing things up, now waiting for results queue to drain 7557 1726882092.41838: waiting for pending results... 7557 1726882092.42220: running TaskExecutor() for managed_node3/TASK: Include the task 'assert_profile_present.yml' 7557 1726882092.42226: in run() - task 12673a56-9f93-ed48-b3a5-00000000005c 7557 1726882092.42232: variable 'ansible_search_path' from source: unknown 7557 1726882092.42274: calling self._execute() 7557 1726882092.42381: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882092.42398: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882092.42426: variable 'omit' from source: magic vars 7557 1726882092.42861: variable 'ansible_distribution_major_version' from source: facts 7557 1726882092.42864: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882092.42867: _execute() done 7557 1726882092.42870: dumping result to json 7557 1726882092.42872: done dumping result, returning 7557 1726882092.42875: done running TaskExecutor() for managed_node3/TASK: Include the task 'assert_profile_present.yml' [12673a56-9f93-ed48-b3a5-00000000005c] 7557 1726882092.42877: sending task result for task 12673a56-9f93-ed48-b3a5-00000000005c 7557 1726882092.42989: no more pending results, returning what we have 7557 1726882092.42997: in VariableManager get_vars() 7557 1726882092.43059: Calling all_inventory to load vars for managed_node3 7557 1726882092.43062: Calling groups_inventory to load vars for managed_node3 7557 1726882092.43065: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882092.43079: Calling all_plugins_play to load vars for managed_node3 7557 1726882092.43082: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882092.43086: Calling groups_plugins_play to load vars for managed_node3 7557 1726882092.43907: done sending task result for task 12673a56-9f93-ed48-b3a5-00000000005c 7557 1726882092.43910: WORKER PROCESS EXITING 7557 1726882092.44667: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882092.46220: done with get_vars() 7557 1726882092.46238: variable 
'ansible_search_path' from source: unknown 7557 1726882092.46252: we have included files to process 7557 1726882092.46253: generating all_blocks data 7557 1726882092.46255: done generating all_blocks data 7557 1726882092.46259: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 7557 1726882092.46260: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 7557 1726882092.46262: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 7557 1726882092.46460: in VariableManager get_vars() 7557 1726882092.46489: done with get_vars() 7557 1726882092.46737: done processing included file 7557 1726882092.46739: iterating over new_blocks loaded from include file 7557 1726882092.46741: in VariableManager get_vars() 7557 1726882092.46764: done with get_vars() 7557 1726882092.46766: filtering new block on tags 7557 1726882092.46787: done filtering new block on tags 7557 1726882092.46789: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml for managed_node3 7557 1726882092.46796: extending task lists for all hosts with included blocks 7557 1726882092.51077: done extending task lists 7557 1726882092.51079: done processing included files 7557 1726882092.51079: results queue empty 7557 1726882092.51080: checking for any_errors_fatal 7557 1726882092.51082: done checking for any_errors_fatal 7557 1726882092.51083: checking for max_fail_percentage 7557 1726882092.51083: done checking for max_fail_percentage 7557 1726882092.51084: checking to see if all hosts have failed and the running result is not ok 7557 1726882092.51085: done checking to see if all hosts have failed 7557 1726882092.51085: getting the remaining hosts for this loop 7557 1726882092.51086: done getting the remaining hosts for this loop 7557 1726882092.51089: getting the next task for host managed_node3 7557 1726882092.51091: done getting next task for host managed_node3 7557 1726882092.51095: ^ task is: TASK: Include the task 'get_profile_stat.yml' 7557 1726882092.51097: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7557 1726882092.51098: getting variables 7557 1726882092.51099: in VariableManager get_vars() 7557 1726882092.51115: Calling all_inventory to load vars for managed_node3 7557 1726882092.51116: Calling groups_inventory to load vars for managed_node3 7557 1726882092.51118: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882092.51122: Calling all_plugins_play to load vars for managed_node3 7557 1726882092.51124: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882092.51125: Calling groups_plugins_play to load vars for managed_node3 7557 1726882092.51784: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882092.52745: done with get_vars() 7557 1726882092.52768: done getting variables TASK [Include the task 'get_profile_stat.yml'] ********************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:3 Friday 20 September 2024 21:28:12 -0400 (0:00:00.113) 0:00:18.381 ****** 7557 1726882092.52850: entering _queue_task() for managed_node3/include_tasks 7557 1726882092.53187: worker is 1 (out of 1 available) 7557 1726882092.53202: exiting _queue_task() for managed_node3/include_tasks 7557 1726882092.53216: done queuing things up, now waiting for results queue to drain 7557 1726882092.53218: waiting for pending results... 7557 1726882092.53552: running TaskExecutor() for managed_node3/TASK: Include the task 'get_profile_stat.yml' 7557 1726882092.53644: in run() - task 12673a56-9f93-ed48-b3a5-000000000ade 7557 1726882092.53659: variable 'ansible_search_path' from source: unknown 7557 1726882092.53662: variable 'ansible_search_path' from source: unknown 7557 1726882092.53701: calling self._execute() 7557 1726882092.53782: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882092.53787: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882092.53799: variable 'omit' from source: magic vars 7557 1726882092.54075: variable 'ansible_distribution_major_version' from source: facts 7557 1726882092.54087: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882092.54090: _execute() done 7557 1726882092.54097: dumping result to json 7557 1726882092.54100: done dumping result, returning 7557 1726882092.54104: done running TaskExecutor() for managed_node3/TASK: Include the task 'get_profile_stat.yml' [12673a56-9f93-ed48-b3a5-000000000ade] 7557 1726882092.54110: sending task result for task 12673a56-9f93-ed48-b3a5-000000000ade 7557 1726882092.54191: done sending task result for task 12673a56-9f93-ed48-b3a5-000000000ade 7557 1726882092.54198: WORKER PROCESS EXITING 7557 1726882092.54226: no more pending results, returning what we have 7557 1726882092.54232: in VariableManager get_vars() 7557 1726882092.54300: Calling all_inventory to load vars for managed_node3 7557 1726882092.54303: Calling groups_inventory to load vars for managed_node3 7557 1726882092.54310: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882092.54322: Calling all_plugins_play to load vars for managed_node3 7557 1726882092.54324: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882092.54327: Calling groups_plugins_play to load vars for managed_node3 7557 1726882092.55217: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882092.56569: 
done with get_vars() 7557 1726882092.56599: variable 'ansible_search_path' from source: unknown 7557 1726882092.56601: variable 'ansible_search_path' from source: unknown 7557 1726882092.56640: we have included files to process 7557 1726882092.56642: generating all_blocks data 7557 1726882092.56643: done generating all_blocks data 7557 1726882092.56644: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 7557 1726882092.56645: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 7557 1726882092.56647: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 7557 1726882092.57726: done processing included file 7557 1726882092.57729: iterating over new_blocks loaded from include file 7557 1726882092.57730: in VariableManager get_vars() 7557 1726882092.57757: done with get_vars() 7557 1726882092.57759: filtering new block on tags 7557 1726882092.57782: done filtering new block on tags 7557 1726882092.57784: in VariableManager get_vars() 7557 1726882092.57812: done with get_vars() 7557 1726882092.57814: filtering new block on tags 7557 1726882092.57834: done filtering new block on tags 7557 1726882092.57836: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed_node3 7557 1726882092.57842: extending task lists for all hosts with included blocks 7557 1726882092.58000: done extending task lists 7557 1726882092.58001: done processing included files 7557 1726882092.58002: results queue empty 7557 1726882092.58003: checking for any_errors_fatal 7557 1726882092.58006: done checking for any_errors_fatal 7557 1726882092.58006: checking for max_fail_percentage 7557 1726882092.58007: done checking for max_fail_percentage 7557 1726882092.58008: checking to see if all hosts have failed and the running result is not ok 7557 1726882092.58009: done checking to see if all hosts have failed 7557 1726882092.58010: getting the remaining hosts for this loop 7557 1726882092.58011: done getting the remaining hosts for this loop 7557 1726882092.58013: getting the next task for host managed_node3 7557 1726882092.58016: done getting next task for host managed_node3 7557 1726882092.58019: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag 7557 1726882092.58022: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7557 1726882092.58024: getting variables 7557 1726882092.58025: in VariableManager get_vars() 7557 1726882092.58108: Calling all_inventory to load vars for managed_node3 7557 1726882092.58111: Calling groups_inventory to load vars for managed_node3 7557 1726882092.58113: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882092.58120: Calling all_plugins_play to load vars for managed_node3 7557 1726882092.58122: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882092.58125: Calling groups_plugins_play to load vars for managed_node3 7557 1726882092.59258: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882092.60915: done with get_vars() 7557 1726882092.60938: done getting variables 7557 1726882092.60984: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Initialize NM profile exist and ansible_managed comment flag] ************ task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3 Friday 20 September 2024 21:28:12 -0400 (0:00:00.081) 0:00:18.463 ****** 7557 1726882092.61019: entering _queue_task() for managed_node3/set_fact 7557 1726882092.61359: worker is 1 (out of 1 available) 7557 1726882092.61373: exiting _queue_task() for managed_node3/set_fact 7557 1726882092.61386: done queuing things up, now waiting for results queue to drain 7557 1726882092.61387: waiting for pending results... 7557 1726882092.61819: running TaskExecutor() for managed_node3/TASK: Initialize NM profile exist and ansible_managed comment flag 7557 1726882092.61825: in run() - task 12673a56-9f93-ed48-b3a5-000000000cef 7557 1726882092.61829: variable 'ansible_search_path' from source: unknown 7557 1726882092.61832: variable 'ansible_search_path' from source: unknown 7557 1726882092.61950: calling self._execute() 7557 1726882092.61972: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882092.61978: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882092.61989: variable 'omit' from source: magic vars 7557 1726882092.62360: variable 'ansible_distribution_major_version' from source: facts 7557 1726882092.62372: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882092.62380: variable 'omit' from source: magic vars 7557 1726882092.62434: variable 'omit' from source: magic vars 7557 1726882092.62470: variable 'omit' from source: magic vars 7557 1726882092.62516: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7557 1726882092.62608: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7557 1726882092.62612: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7557 1726882092.62615: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7557 1726882092.62617: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7557 1726882092.62635: variable 'inventory_hostname' from source: host vars for 
'managed_node3' 7557 1726882092.62639: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882092.62641: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882092.62755: Set connection var ansible_module_compression to ZIP_DEFLATED 7557 1726882092.62763: Set connection var ansible_shell_executable to /bin/sh 7557 1726882092.62766: Set connection var ansible_shell_type to sh 7557 1726882092.62770: Set connection var ansible_pipelining to False 7557 1726882092.62773: Set connection var ansible_connection to ssh 7557 1726882092.62779: Set connection var ansible_timeout to 10 7557 1726882092.62801: variable 'ansible_shell_executable' from source: unknown 7557 1726882092.62804: variable 'ansible_connection' from source: unknown 7557 1726882092.62807: variable 'ansible_module_compression' from source: unknown 7557 1726882092.62809: variable 'ansible_shell_type' from source: unknown 7557 1726882092.62812: variable 'ansible_shell_executable' from source: unknown 7557 1726882092.62814: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882092.62817: variable 'ansible_pipelining' from source: unknown 7557 1726882092.62932: variable 'ansible_timeout' from source: unknown 7557 1726882092.62935: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882092.62979: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 7557 1726882092.62990: variable 'omit' from source: magic vars 7557 1726882092.62997: starting attempt loop 7557 1726882092.63001: running the handler 7557 1726882092.63012: handler run complete 7557 1726882092.63022: attempt loop complete, returning result 7557 1726882092.63025: _execute() done 7557 1726882092.63027: dumping result to json 7557 1726882092.63030: done dumping result, returning 7557 1726882092.63049: done running TaskExecutor() for managed_node3/TASK: Initialize NM profile exist and ansible_managed comment flag [12673a56-9f93-ed48-b3a5-000000000cef] 7557 1726882092.63052: sending task result for task 12673a56-9f93-ed48-b3a5-000000000cef 7557 1726882092.63135: done sending task result for task 12673a56-9f93-ed48-b3a5-000000000cef 7557 1726882092.63138: WORKER PROCESS EXITING ok: [managed_node3] => { "ansible_facts": { "lsr_net_profile_ansible_managed": false, "lsr_net_profile_exists": false, "lsr_net_profile_fingerprint": false }, "changed": false } 7557 1726882092.63205: no more pending results, returning what we have 7557 1726882092.63209: results queue empty 7557 1726882092.63210: checking for any_errors_fatal 7557 1726882092.63211: done checking for any_errors_fatal 7557 1726882092.63212: checking for max_fail_percentage 7557 1726882092.63214: done checking for max_fail_percentage 7557 1726882092.63214: checking to see if all hosts have failed and the running result is not ok 7557 1726882092.63215: done checking to see if all hosts have failed 7557 1726882092.63216: getting the remaining hosts for this loop 7557 1726882092.63217: done getting the remaining hosts for this loop 7557 1726882092.63221: getting the next task for host managed_node3 7557 1726882092.63227: done getting next task for host managed_node3 7557 1726882092.63229: ^ task is: TASK: Stat profile file 7557 1726882092.63234: ^ state is: HOST 
STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7557 1726882092.63237: getting variables 7557 1726882092.63238: in VariableManager get_vars() 7557 1726882092.63287: Calling all_inventory to load vars for managed_node3 7557 1726882092.63290: Calling groups_inventory to load vars for managed_node3 7557 1726882092.63297: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882092.63306: Calling all_plugins_play to load vars for managed_node3 7557 1726882092.63309: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882092.63311: Calling groups_plugins_play to load vars for managed_node3 7557 1726882092.64665: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882092.66299: done with get_vars() 7557 1726882092.66329: done getting variables TASK [Stat profile file] ******************************************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9 Friday 20 September 2024 21:28:12 -0400 (0:00:00.054) 0:00:18.517 ****** 7557 1726882092.66435: entering _queue_task() for managed_node3/stat 7557 1726882092.66791: worker is 1 (out of 1 available) 7557 1726882092.66804: exiting _queue_task() for managed_node3/stat 7557 1726882092.66816: done queuing things up, now waiting for results queue to drain 7557 1726882092.66817: waiting for pending results... 
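
Note: the "Stat profile file" task repeats the same remote execution cycle seen earlier in this log: query the remote home directory, create a uniquely named directory under ~/.ansible/tmp, sftp the AnsiballZ wrapper across, chmod it, run it with the remote Python, then rm -rf the directory. A rough local reconstruction of the temp-directory command string the controller builds; the suffix fields look like a timestamp, a worker PID, and a random number, but that composition is an inference from the log, not a documented format, and the base path simply mirrors the '_ansible_remote_tmp': '~/.ansible/tmp' module arg shown above:

    import os
    import random
    import subprocess
    import time

    base = os.path.expanduser("~/.ansible/tmp")
    suffix = f"ansible-tmp-{time.time()}-{os.getpid()}-{random.randint(0, 2**48)}"
    tmpdir = f"{base}/{suffix}"

    # Same shell fragment as in the log, run locally here instead of over the
    # multiplexed ssh connection the controller uses.
    cmd = (
        f'( umask 77 && mkdir -p "` echo {base} `" '
        f'&& mkdir "` echo {tmpdir} `" '
        f'&& echo {suffix}="` echo {tmpdir} `" ) && sleep 0'
    )
    done = subprocess.run(["/bin/sh", "-c", cmd], capture_output=True, text=True)
    print(done.returncode, done.stdout.strip())
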
7557 1726882092.67317: running TaskExecutor() for managed_node3/TASK: Stat profile file 7557 1726882092.67324: in run() - task 12673a56-9f93-ed48-b3a5-000000000cf0 7557 1726882092.67329: variable 'ansible_search_path' from source: unknown 7557 1726882092.67333: variable 'ansible_search_path' from source: unknown 7557 1726882092.67337: calling self._execute() 7557 1726882092.67389: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882092.67398: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882092.67408: variable 'omit' from source: magic vars 7557 1726882092.67825: variable 'ansible_distribution_major_version' from source: facts 7557 1726882092.67836: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882092.67843: variable 'omit' from source: magic vars 7557 1726882092.67904: variable 'omit' from source: magic vars 7557 1726882092.68015: variable 'profile' from source: include params 7557 1726882092.68018: variable 'interface' from source: play vars 7557 1726882092.68102: variable 'interface' from source: play vars 7557 1726882092.68123: variable 'omit' from source: magic vars 7557 1726882092.68166: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7557 1726882092.68216: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7557 1726882092.68234: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7557 1726882092.68253: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7557 1726882092.68265: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7557 1726882092.68400: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7557 1726882092.68403: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882092.68406: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882092.68433: Set connection var ansible_module_compression to ZIP_DEFLATED 7557 1726882092.68441: Set connection var ansible_shell_executable to /bin/sh 7557 1726882092.68444: Set connection var ansible_shell_type to sh 7557 1726882092.68449: Set connection var ansible_pipelining to False 7557 1726882092.68452: Set connection var ansible_connection to ssh 7557 1726882092.68458: Set connection var ansible_timeout to 10 7557 1726882092.68481: variable 'ansible_shell_executable' from source: unknown 7557 1726882092.68484: variable 'ansible_connection' from source: unknown 7557 1726882092.68487: variable 'ansible_module_compression' from source: unknown 7557 1726882092.68489: variable 'ansible_shell_type' from source: unknown 7557 1726882092.68492: variable 'ansible_shell_executable' from source: unknown 7557 1726882092.68498: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882092.68500: variable 'ansible_pipelining' from source: unknown 7557 1726882092.68503: variable 'ansible_timeout' from source: unknown 7557 1726882092.68508: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882092.68724: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 7557 1726882092.68733: variable 'omit' from source: magic vars 7557 1726882092.68737: starting attempt loop 7557 1726882092.68752: running the handler 7557 1726882092.68765: _low_level_execute_command(): starting 7557 1726882092.68774: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7557 1726882092.69612: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882092.69664: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882092.69668: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882092.69670: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882092.69755: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882092.71450: stdout chunk (state=3): >>>/root <<< 7557 1726882092.71618: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882092.71622: stdout chunk (state=3): >>><<< 7557 1726882092.71625: stderr chunk (state=3): >>><<< 7557 1726882092.71646: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7557 1726882092.71667: _low_level_execute_command(): starting 7557 1726882092.71679: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp 
`"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882092.7165358-8332-219051633844928 `" && echo ansible-tmp-1726882092.7165358-8332-219051633844928="` echo /root/.ansible/tmp/ansible-tmp-1726882092.7165358-8332-219051633844928 `" ) && sleep 0' 7557 1726882092.72318: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7557 1726882092.72333: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7557 1726882092.72362: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882092.72379: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7557 1726882092.72399: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 <<< 7557 1726882092.72469: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882092.72527: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882092.72545: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882092.72579: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882092.72655: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882092.74562: stdout chunk (state=3): >>>ansible-tmp-1726882092.7165358-8332-219051633844928=/root/.ansible/tmp/ansible-tmp-1726882092.7165358-8332-219051633844928 <<< 7557 1726882092.74709: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882092.74722: stderr chunk (state=3): >>><<< 7557 1726882092.74731: stdout chunk (state=3): >>><<< 7557 1726882092.74761: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882092.7165358-8332-219051633844928=/root/.ansible/tmp/ansible-tmp-1726882092.7165358-8332-219051633844928 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7557 1726882092.74824: variable 'ansible_module_compression' from source: unknown 7557 1726882092.74985: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-7557ap94rh2e/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 7557 1726882092.74988: variable 'ansible_facts' from source: unknown 7557 1726882092.75033: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882092.7165358-8332-219051633844928/AnsiballZ_stat.py 7557 1726882092.75233: Sending initial data 7557 1726882092.75242: Sent initial data (151 bytes) 7557 1726882092.75868: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7557 1726882092.75912: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882092.75929: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7557 1726882092.76025: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882092.76072: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882092.76117: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882092.77675: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 7557 1726882092.77710: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 7557 1726882092.77755: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-7557ap94rh2e/tmpscixage8 /root/.ansible/tmp/ansible-tmp-1726882092.7165358-8332-219051633844928/AnsiballZ_stat.py <<< 7557 1726882092.77761: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882092.7165358-8332-219051633844928/AnsiballZ_stat.py" <<< 7557 1726882092.77810: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-7557ap94rh2e/tmpscixage8" to remote "/root/.ansible/tmp/ansible-tmp-1726882092.7165358-8332-219051633844928/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882092.7165358-8332-219051633844928/AnsiballZ_stat.py" <<< 7557 1726882092.78358: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882092.78402: stderr chunk (state=3): >>><<< 7557 1726882092.78405: stdout chunk (state=3): >>><<< 7557 1726882092.78438: done transferring module to remote 7557 1726882092.78449: _low_level_execute_command(): starting 7557 1726882092.78454: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882092.7165358-8332-219051633844928/ /root/.ansible/tmp/ansible-tmp-1726882092.7165358-8332-219051633844928/AnsiballZ_stat.py && sleep 0' 7557 1726882092.78870: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882092.78899: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found <<< 7557 1726882092.78903: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882092.78905: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 7557 1726882092.78908: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 <<< 7557 1726882092.78916: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882092.78965: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882092.78969: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882092.79032: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882092.80783: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882092.80787: stdout chunk (state=3): >>><<< 7557 1726882092.80789: stderr chunk (state=3): >>><<< 7557 1726882092.80818: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7557 1726882092.80822: _low_level_execute_command(): starting 7557 1726882092.80825: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882092.7165358-8332-219051633844928/AnsiballZ_stat.py && sleep 0' 7557 1726882092.81279: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882092.81282: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882092.81284: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882092.81288: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882092.81328: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882092.81341: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882092.81406: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882092.96245: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-veth0", "follow": false, "checksum_algorithm": "sha1"}}} <<< 7557 1726882092.97388: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. 
<<< 7557 1726882092.97417: stderr chunk (state=3): >>><<< 7557 1726882092.97421: stdout chunk (state=3): >>><<< 7557 1726882092.97437: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-veth0", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. 7557 1726882092.97459: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-veth0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882092.7165358-8332-219051633844928/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7557 1726882092.97467: _low_level_execute_command(): starting 7557 1726882092.97475: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882092.7165358-8332-219051633844928/ > /dev/null 2>&1 && sleep 0' 7557 1726882092.97934: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882092.97938: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882092.97940: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config <<< 7557 1726882092.97942: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882092.97991: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882092.98001: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882092.98003: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882092.98045: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882092.99798: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882092.99823: stderr chunk (state=3): >>><<< 7557 1726882092.99826: stdout chunk (state=3): >>><<< 7557 1726882092.99839: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7557 1726882092.99845: handler run complete 7557 1726882092.99864: attempt loop complete, returning result 7557 1726882092.99867: _execute() done 7557 1726882092.99871: dumping result to json 7557 1726882092.99874: done dumping result, returning 7557 1726882092.99881: done running TaskExecutor() for managed_node3/TASK: Stat profile file [12673a56-9f93-ed48-b3a5-000000000cf0] 7557 1726882092.99886: sending task result for task 12673a56-9f93-ed48-b3a5-000000000cf0 7557 1726882092.99976: done sending task result for task 12673a56-9f93-ed48-b3a5-000000000cf0 7557 1726882092.99979: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "stat": { "exists": false } } 7557 1726882093.00037: no more pending results, returning what we have 7557 1726882093.00040: results queue empty 7557 1726882093.00041: checking for any_errors_fatal 7557 1726882093.00049: done checking for any_errors_fatal 7557 1726882093.00049: checking for max_fail_percentage 7557 1726882093.00051: done checking for max_fail_percentage 7557 1726882093.00052: checking to see if all hosts have failed and the running result is not ok 7557 1726882093.00053: done checking to see if all hosts have failed 7557 1726882093.00053: getting the remaining hosts for this loop 7557 1726882093.00055: done getting the remaining hosts for this loop 7557 
1726882093.00058: getting the next task for host managed_node3 7557 1726882093.00064: done getting next task for host managed_node3 7557 1726882093.00066: ^ task is: TASK: Set NM profile exist flag based on the profile files 7557 1726882093.00070: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7557 1726882093.00074: getting variables 7557 1726882093.00075: in VariableManager get_vars() 7557 1726882093.00129: Calling all_inventory to load vars for managed_node3 7557 1726882093.00132: Calling groups_inventory to load vars for managed_node3 7557 1726882093.00134: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882093.00145: Calling all_plugins_play to load vars for managed_node3 7557 1726882093.00147: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882093.00150: Calling groups_plugins_play to load vars for managed_node3 7557 1726882093.01063: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882093.01914: done with get_vars() 7557 1726882093.01932: done getting variables 7557 1726882093.01974: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag based on the profile files] ******************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17 Friday 20 September 2024 21:28:13 -0400 (0:00:00.355) 0:00:18.872 ****** 7557 1726882093.02000: entering _queue_task() for managed_node3/set_fact 7557 1726882093.02239: worker is 1 (out of 1 available) 7557 1726882093.02253: exiting _queue_task() for managed_node3/set_fact 7557 1726882093.02267: done queuing things up, now waiting for results queue to drain 7557 1726882093.02268: waiting for pending results... 
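For orientation, the two tasks the log is working through at this point can be reconstructed roughly from the module arguments and conditionals printed above; a minimal sketch, assuming the register name profile_stat and the path exactly as the log shows them (the fact name in the second task is a placeholder, since the log skips the task before printing it):

    - name: Stat profile file
      stat:
        path: /etc/sysconfig/network-scripts/ifcfg-veth0  # expanded from the profile/interface vars per the log
        get_attributes: false
        get_checksum: false
        get_mime: false
        follow: false
      register: profile_stat

    - name: Set NM profile exist flag based on the profile files
      set_fact:
        profile_exists: true  # placeholder fact name; not visible in the log because the task is skipped
      when: profile_stat.stat.exists

Because the stat result above reports "exists": false, the when condition evaluates to False and the next task is skipped, which is exactly what the following log lines record.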
7557 1726882093.02438: running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag based on the profile files 7557 1726882093.02520: in run() - task 12673a56-9f93-ed48-b3a5-000000000cf1 7557 1726882093.02530: variable 'ansible_search_path' from source: unknown 7557 1726882093.02533: variable 'ansible_search_path' from source: unknown 7557 1726882093.02561: calling self._execute() 7557 1726882093.02637: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882093.02641: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882093.02651: variable 'omit' from source: magic vars 7557 1726882093.02929: variable 'ansible_distribution_major_version' from source: facts 7557 1726882093.02938: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882093.03021: variable 'profile_stat' from source: set_fact 7557 1726882093.03033: Evaluated conditional (profile_stat.stat.exists): False 7557 1726882093.03036: when evaluation is False, skipping this task 7557 1726882093.03040: _execute() done 7557 1726882093.03044: dumping result to json 7557 1726882093.03047: done dumping result, returning 7557 1726882093.03049: done running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag based on the profile files [12673a56-9f93-ed48-b3a5-000000000cf1] 7557 1726882093.03051: sending task result for task 12673a56-9f93-ed48-b3a5-000000000cf1 7557 1726882093.03137: done sending task result for task 12673a56-9f93-ed48-b3a5-000000000cf1 7557 1726882093.03140: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 7557 1726882093.03207: no more pending results, returning what we have 7557 1726882093.03210: results queue empty 7557 1726882093.03211: checking for any_errors_fatal 7557 1726882093.03217: done checking for any_errors_fatal 7557 1726882093.03218: checking for max_fail_percentage 7557 1726882093.03220: done checking for max_fail_percentage 7557 1726882093.03221: checking to see if all hosts have failed and the running result is not ok 7557 1726882093.03221: done checking to see if all hosts have failed 7557 1726882093.03222: getting the remaining hosts for this loop 7557 1726882093.03224: done getting the remaining hosts for this loop 7557 1726882093.03226: getting the next task for host managed_node3 7557 1726882093.03232: done getting next task for host managed_node3 7557 1726882093.03234: ^ task is: TASK: Get NM profile info 7557 1726882093.03237: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7557 1726882093.03241: getting variables 7557 1726882093.03242: in VariableManager get_vars() 7557 1726882093.03282: Calling all_inventory to load vars for managed_node3 7557 1726882093.03284: Calling groups_inventory to load vars for managed_node3 7557 1726882093.03286: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882093.03300: Calling all_plugins_play to load vars for managed_node3 7557 1726882093.03302: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882093.03305: Calling groups_plugins_play to load vars for managed_node3 7557 1726882093.04062: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882093.04917: done with get_vars() 7557 1726882093.04937: done getting variables 7557 1726882093.05012: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Get NM profile info] ***************************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25 Friday 20 September 2024 21:28:13 -0400 (0:00:00.030) 0:00:18.903 ****** 7557 1726882093.05036: entering _queue_task() for managed_node3/shell 7557 1726882093.05037: Creating lock for shell 7557 1726882093.05298: worker is 1 (out of 1 available) 7557 1726882093.05312: exiting _queue_task() for managed_node3/shell 7557 1726882093.05326: done queuing things up, now waiting for results queue to drain 7557 1726882093.05328: waiting for pending results... 
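The "Get NM profile info" task queued here runs through the shell action plugin; a hedged sketch of what it likely looks like, using the command string and the registered variable name (nm_profile_exists) that appear later in the log (ignore_errors is an assumption, since grep exits non-zero when no profile matches):

    - name: Get NM profile info
      shell: nmcli -f NAME,FILENAME connection show |grep veth0 | grep /etc
      register: nm_profile_exists
      ignore_errors: true  # assumed; not printed by the log

The registered rc is what the subsequent set_fact task's conditional (nm_profile_exists.rc == 0) checks.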
7557 1726882093.05502: running TaskExecutor() for managed_node3/TASK: Get NM profile info 7557 1726882093.05568: in run() - task 12673a56-9f93-ed48-b3a5-000000000cf2 7557 1726882093.05582: variable 'ansible_search_path' from source: unknown 7557 1726882093.05585: variable 'ansible_search_path' from source: unknown 7557 1726882093.05617: calling self._execute() 7557 1726882093.05701: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882093.05704: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882093.05711: variable 'omit' from source: magic vars 7557 1726882093.05987: variable 'ansible_distribution_major_version' from source: facts 7557 1726882093.06002: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882093.06006: variable 'omit' from source: magic vars 7557 1726882093.06041: variable 'omit' from source: magic vars 7557 1726882093.06114: variable 'profile' from source: include params 7557 1726882093.06118: variable 'interface' from source: play vars 7557 1726882093.06169: variable 'interface' from source: play vars 7557 1726882093.06184: variable 'omit' from source: magic vars 7557 1726882093.06221: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7557 1726882093.06248: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7557 1726882093.06264: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7557 1726882093.06277: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7557 1726882093.06287: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7557 1726882093.06314: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7557 1726882093.06318: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882093.06322: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882093.06391: Set connection var ansible_module_compression to ZIP_DEFLATED 7557 1726882093.06398: Set connection var ansible_shell_executable to /bin/sh 7557 1726882093.06402: Set connection var ansible_shell_type to sh 7557 1726882093.06407: Set connection var ansible_pipelining to False 7557 1726882093.06409: Set connection var ansible_connection to ssh 7557 1726882093.06415: Set connection var ansible_timeout to 10 7557 1726882093.06439: variable 'ansible_shell_executable' from source: unknown 7557 1726882093.06442: variable 'ansible_connection' from source: unknown 7557 1726882093.06444: variable 'ansible_module_compression' from source: unknown 7557 1726882093.06447: variable 'ansible_shell_type' from source: unknown 7557 1726882093.06449: variable 'ansible_shell_executable' from source: unknown 7557 1726882093.06451: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882093.06453: variable 'ansible_pipelining' from source: unknown 7557 1726882093.06455: variable 'ansible_timeout' from source: unknown 7557 1726882093.06457: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882093.06556: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 7557 1726882093.06564: variable 'omit' from source: magic vars 7557 1726882093.06569: starting attempt loop 7557 1726882093.06572: running the handler 7557 1726882093.06582: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 7557 1726882093.06602: _low_level_execute_command(): starting 7557 1726882093.06609: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7557 1726882093.07135: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7557 1726882093.07138: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found <<< 7557 1726882093.07141: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 7557 1726882093.07144: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882093.07199: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882093.07203: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882093.07205: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882093.07271: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882093.08844: stdout chunk (state=3): >>>/root <<< 7557 1726882093.08942: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882093.08969: stderr chunk (state=3): >>><<< 7557 1726882093.08973: stdout chunk (state=3): >>><<< 7557 1726882093.08996: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7557 1726882093.09012: _low_level_execute_command(): starting 7557 1726882093.09018: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882093.0899956-8345-225204438009984 `" && echo ansible-tmp-1726882093.0899956-8345-225204438009984="` echo /root/.ansible/tmp/ansible-tmp-1726882093.0899956-8345-225204438009984 `" ) && sleep 0' 7557 1726882093.09458: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7557 1726882093.09497: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7557 1726882093.09501: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found <<< 7557 1726882093.09503: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882093.09506: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 7557 1726882093.09508: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882093.09554: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882093.09557: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882093.09559: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882093.09612: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882093.11432: stdout chunk (state=3): >>>ansible-tmp-1726882093.0899956-8345-225204438009984=/root/.ansible/tmp/ansible-tmp-1726882093.0899956-8345-225204438009984 <<< 7557 1726882093.11537: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882093.11563: stderr chunk (state=3): >>><<< 7557 1726882093.11566: stdout chunk (state=3): >>><<< 7557 1726882093.11581: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882093.0899956-8345-225204438009984=/root/.ansible/tmp/ansible-tmp-1726882093.0899956-8345-225204438009984 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address 
debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7557 1726882093.11613: variable 'ansible_module_compression' from source: unknown 7557 1726882093.11659: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-7557ap94rh2e/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 7557 1726882093.11690: variable 'ansible_facts' from source: unknown 7557 1726882093.11743: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882093.0899956-8345-225204438009984/AnsiballZ_command.py 7557 1726882093.11850: Sending initial data 7557 1726882093.11853: Sent initial data (154 bytes) 7557 1726882093.12285: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882093.12320: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7557 1726882093.12327: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882093.12330: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882093.12333: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882093.12381: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882093.12388: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882093.12390: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882093.12438: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882093.13938: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" 
debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 7557 1726882093.13981: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 7557 1726882093.14027: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-7557ap94rh2e/tmpn0jphmv6 /root/.ansible/tmp/ansible-tmp-1726882093.0899956-8345-225204438009984/AnsiballZ_command.py <<< 7557 1726882093.14031: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882093.0899956-8345-225204438009984/AnsiballZ_command.py" <<< 7557 1726882093.14072: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-7557ap94rh2e/tmpn0jphmv6" to remote "/root/.ansible/tmp/ansible-tmp-1726882093.0899956-8345-225204438009984/AnsiballZ_command.py" <<< 7557 1726882093.14077: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882093.0899956-8345-225204438009984/AnsiballZ_command.py" <<< 7557 1726882093.14613: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882093.14652: stderr chunk (state=3): >>><<< 7557 1726882093.14655: stdout chunk (state=3): >>><<< 7557 1726882093.14698: done transferring module to remote 7557 1726882093.14710: _low_level_execute_command(): starting 7557 1726882093.14713: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882093.0899956-8345-225204438009984/ /root/.ansible/tmp/ansible-tmp-1726882093.0899956-8345-225204438009984/AnsiballZ_command.py && sleep 0' 7557 1726882093.15172: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882093.15180: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882093.15182: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 7557 1726882093.15184: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found <<< 7557 1726882093.15186: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882093.15228: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882093.15231: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882093.15286: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882093.16958: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882093.16983: stderr chunk (state=3): >>><<< 7557 1726882093.16988: stdout chunk (state=3): >>><<< 7557 1726882093.17010: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7557 1726882093.17013: _low_level_execute_command(): starting 7557 1726882093.17018: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882093.0899956-8345-225204438009984/AnsiballZ_command.py && sleep 0' 7557 1726882093.17457: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7557 1726882093.17461: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7557 1726882093.17491: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found <<< 7557 1726882093.17499: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 7557 1726882093.17502: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7557 1726882093.17504: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882093.17550: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882093.17554: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882093.17563: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882093.17621: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882093.41368: stdout chunk (state=3): >>> {"changed": true, "stdout": "veth0 /etc/NetworkManager/system-connections/veth0.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep veth0 | grep /etc", "start": "2024-09-20 21:28:13.325038", "end": "2024-09-20 21:28:13.411894", "delta": "0:00:00.086856", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep veth0 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, 
"stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 7557 1726882093.42843: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. <<< 7557 1726882093.42873: stderr chunk (state=3): >>><<< 7557 1726882093.42876: stdout chunk (state=3): >>><<< 7557 1726882093.42897: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "veth0 /etc/NetworkManager/system-connections/veth0.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep veth0 | grep /etc", "start": "2024-09-20 21:28:13.325038", "end": "2024-09-20 21:28:13.411894", "delta": "0:00:00.086856", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep veth0 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. 
7557 1726882093.42929: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep veth0 | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882093.0899956-8345-225204438009984/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7557 1726882093.42937: _low_level_execute_command(): starting 7557 1726882093.42942: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882093.0899956-8345-225204438009984/ > /dev/null 2>&1 && sleep 0' 7557 1726882093.43383: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882093.43425: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7557 1726882093.43428: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found <<< 7557 1726882093.43430: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 7557 1726882093.43432: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found <<< 7557 1726882093.43434: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882093.43479: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882093.43483: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882093.43485: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882093.43542: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882093.45306: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882093.45333: stderr chunk (state=3): >>><<< 7557 1726882093.45336: stdout chunk (state=3): >>><<< 7557 1726882093.45355: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: 
hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7557 1726882093.45362: handler run complete 7557 1726882093.45379: Evaluated conditional (False): False 7557 1726882093.45387: attempt loop complete, returning result 7557 1726882093.45390: _execute() done 7557 1726882093.45395: dumping result to json 7557 1726882093.45398: done dumping result, returning 7557 1726882093.45406: done running TaskExecutor() for managed_node3/TASK: Get NM profile info [12673a56-9f93-ed48-b3a5-000000000cf2] 7557 1726882093.45411: sending task result for task 12673a56-9f93-ed48-b3a5-000000000cf2 7557 1726882093.45509: done sending task result for task 12673a56-9f93-ed48-b3a5-000000000cf2 7557 1726882093.45512: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "cmd": "nmcli -f NAME,FILENAME connection show |grep veth0 | grep /etc", "delta": "0:00:00.086856", "end": "2024-09-20 21:28:13.411894", "rc": 0, "start": "2024-09-20 21:28:13.325038" } STDOUT: veth0 /etc/NetworkManager/system-connections/veth0.nmconnection 7557 1726882093.45577: no more pending results, returning what we have 7557 1726882093.45580: results queue empty 7557 1726882093.45581: checking for any_errors_fatal 7557 1726882093.45587: done checking for any_errors_fatal 7557 1726882093.45588: checking for max_fail_percentage 7557 1726882093.45589: done checking for max_fail_percentage 7557 1726882093.45590: checking to see if all hosts have failed and the running result is not ok 7557 1726882093.45591: done checking to see if all hosts have failed 7557 1726882093.45591: getting the remaining hosts for this loop 7557 1726882093.45597: done getting the remaining hosts for this loop 7557 1726882093.45600: getting the next task for host managed_node3 7557 1726882093.45607: done getting next task for host managed_node3 7557 1726882093.45610: ^ task is: TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 7557 1726882093.45614: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7557 1726882093.45618: getting variables 7557 1726882093.45619: in VariableManager get_vars() 7557 1726882093.45672: Calling all_inventory to load vars for managed_node3 7557 1726882093.45674: Calling groups_inventory to load vars for managed_node3 7557 1726882093.45676: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882093.45686: Calling all_plugins_play to load vars for managed_node3 7557 1726882093.45689: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882093.45691: Calling groups_plugins_play to load vars for managed_node3 7557 1726882093.46615: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882093.47457: done with get_vars() 7557 1726882093.47475: done getting variables 7557 1726882093.47521: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli output] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35 Friday 20 September 2024 21:28:13 -0400 (0:00:00.425) 0:00:19.328 ****** 7557 1726882093.47546: entering _queue_task() for managed_node3/set_fact 7557 1726882093.47789: worker is 1 (out of 1 available) 7557 1726882093.47808: exiting _queue_task() for managed_node3/set_fact 7557 1726882093.47822: done queuing things up, now waiting for results queue to drain 7557 1726882093.47823: waiting for pending results... 7557 1726882093.47998: running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 7557 1726882093.48079: in run() - task 12673a56-9f93-ed48-b3a5-000000000cf3 7557 1726882093.48091: variable 'ansible_search_path' from source: unknown 7557 1726882093.48100: variable 'ansible_search_path' from source: unknown 7557 1726882093.48125: calling self._execute() 7557 1726882093.48204: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882093.48208: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882093.48219: variable 'omit' from source: magic vars 7557 1726882093.48491: variable 'ansible_distribution_major_version' from source: facts 7557 1726882093.48504: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882093.48591: variable 'nm_profile_exists' from source: set_fact 7557 1726882093.48606: Evaluated conditional (nm_profile_exists.rc == 0): True 7557 1726882093.48611: variable 'omit' from source: magic vars 7557 1726882093.48644: variable 'omit' from source: magic vars 7557 1726882093.48665: variable 'omit' from source: magic vars 7557 1726882093.48699: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7557 1726882093.48729: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7557 1726882093.48745: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7557 1726882093.48759: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7557 1726882093.48768: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7557 1726882093.48792: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7557 1726882093.48799: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882093.48801: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882093.48872: Set connection var ansible_module_compression to ZIP_DEFLATED 7557 1726882093.48878: Set connection var ansible_shell_executable to /bin/sh 7557 1726882093.48881: Set connection var ansible_shell_type to sh 7557 1726882093.48886: Set connection var ansible_pipelining to False 7557 1726882093.48888: Set connection var ansible_connection to ssh 7557 1726882093.48897: Set connection var ansible_timeout to 10 7557 1726882093.48912: variable 'ansible_shell_executable' from source: unknown 7557 1726882093.48914: variable 'ansible_connection' from source: unknown 7557 1726882093.48917: variable 'ansible_module_compression' from source: unknown 7557 1726882093.48919: variable 'ansible_shell_type' from source: unknown 7557 1726882093.48923: variable 'ansible_shell_executable' from source: unknown 7557 1726882093.48926: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882093.48928: variable 'ansible_pipelining' from source: unknown 7557 1726882093.48930: variable 'ansible_timeout' from source: unknown 7557 1726882093.48933: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882093.49034: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 7557 1726882093.49045: variable 'omit' from source: magic vars 7557 1726882093.49048: starting attempt loop 7557 1726882093.49050: running the handler 7557 1726882093.49063: handler run complete 7557 1726882093.49071: attempt loop complete, returning result 7557 1726882093.49073: _execute() done 7557 1726882093.49076: dumping result to json 7557 1726882093.49078: done dumping result, returning 7557 1726882093.49086: done running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [12673a56-9f93-ed48-b3a5-000000000cf3] 7557 1726882093.49090: sending task result for task 12673a56-9f93-ed48-b3a5-000000000cf3 7557 1726882093.49169: done sending task result for task 12673a56-9f93-ed48-b3a5-000000000cf3 7557 1726882093.49173: WORKER PROCESS EXITING ok: [managed_node3] => { "ansible_facts": { "lsr_net_profile_ansible_managed": true, "lsr_net_profile_exists": true, "lsr_net_profile_fingerprint": true }, "changed": false } 7557 1726882093.49230: no more pending results, returning what we have 7557 1726882093.49233: results queue empty 7557 1726882093.49234: checking for any_errors_fatal 7557 1726882093.49244: done checking for any_errors_fatal 7557 1726882093.49244: checking for max_fail_percentage 7557 1726882093.49246: done checking for max_fail_percentage 7557 1726882093.49247: checking to see if all hosts have failed and the running result is not ok 7557 1726882093.49247: done checking to see if all hosts have failed 7557 1726882093.49248: getting the remaining hosts for this loop 7557 1726882093.49250: done getting the remaining hosts for this loop 7557 1726882093.49253: getting 
the next task for host managed_node3 7557 1726882093.49262: done getting next task for host managed_node3 7557 1726882093.49264: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }} 7557 1726882093.49269: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7557 1726882093.49272: getting variables 7557 1726882093.49274: in VariableManager get_vars() 7557 1726882093.49335: Calling all_inventory to load vars for managed_node3 7557 1726882093.49338: Calling groups_inventory to load vars for managed_node3 7557 1726882093.49340: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882093.49350: Calling all_plugins_play to load vars for managed_node3 7557 1726882093.49352: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882093.49355: Calling groups_plugins_play to load vars for managed_node3 7557 1726882093.50125: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882093.50969: done with get_vars() 7557 1726882093.50984: done getting variables 7557 1726882093.51028: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 7557 1726882093.51113: variable 'profile' from source: include params 7557 1726882093.51116: variable 'interface' from source: play vars 7557 1726882093.51161: variable 'interface' from source: play vars TASK [Get the ansible_managed comment in ifcfg-veth0] ************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49 Friday 20 September 2024 21:28:13 -0400 (0:00:00.036) 0:00:19.364 ****** 7557 1726882093.51188: entering _queue_task() for managed_node3/command 7557 1726882093.51412: worker is 1 (out of 1 available) 7557 1726882093.51424: exiting _queue_task() for managed_node3/command 7557 1726882093.51437: done queuing things up, now waiting for results queue to drain 7557 1726882093.51439: waiting for pending results... 
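The result above comes from the "Get NM profile info" shell pipeline together with the set_fact task that interprets it; the three flags and the nm_profile_exists.rc == 0 conditional are taken verbatim from the log, while the overall task layout is only a minimal sketch and may differ from the real get_profile_stat.yml:

  - name: Get NM profile info
    ansible.builtin.shell: nmcli -f NAME,FILENAME connection show |grep veth0 | grep /etc
    register: nm_profile_exists   # register name inferred from the logged conditional

  - name: Set NM profile exist flag and ansible_managed flag true based on the nmcli output
    ansible.builtin.set_fact:
      lsr_net_profile_exists: true
      lsr_net_profile_ansible_managed: true
      lsr_net_profile_fingerprint: true
    when: nm_profile_exists.rc == 0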
7557 1726882093.51609: running TaskExecutor() for managed_node3/TASK: Get the ansible_managed comment in ifcfg-veth0 7557 1726882093.51680: in run() - task 12673a56-9f93-ed48-b3a5-000000000cf5 7557 1726882093.51696: variable 'ansible_search_path' from source: unknown 7557 1726882093.51701: variable 'ansible_search_path' from source: unknown 7557 1726882093.51724: calling self._execute() 7557 1726882093.51804: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882093.51807: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882093.51815: variable 'omit' from source: magic vars 7557 1726882093.52068: variable 'ansible_distribution_major_version' from source: facts 7557 1726882093.52078: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882093.52161: variable 'profile_stat' from source: set_fact 7557 1726882093.52173: Evaluated conditional (profile_stat.stat.exists): False 7557 1726882093.52176: when evaluation is False, skipping this task 7557 1726882093.52178: _execute() done 7557 1726882093.52181: dumping result to json 7557 1726882093.52183: done dumping result, returning 7557 1726882093.52189: done running TaskExecutor() for managed_node3/TASK: Get the ansible_managed comment in ifcfg-veth0 [12673a56-9f93-ed48-b3a5-000000000cf5] 7557 1726882093.52197: sending task result for task 12673a56-9f93-ed48-b3a5-000000000cf5 7557 1726882093.52273: done sending task result for task 12673a56-9f93-ed48-b3a5-000000000cf5 7557 1726882093.52276: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 7557 1726882093.52358: no more pending results, returning what we have 7557 1726882093.52361: results queue empty 7557 1726882093.52362: checking for any_errors_fatal 7557 1726882093.52365: done checking for any_errors_fatal 7557 1726882093.52366: checking for max_fail_percentage 7557 1726882093.52368: done checking for max_fail_percentage 7557 1726882093.52368: checking to see if all hosts have failed and the running result is not ok 7557 1726882093.52369: done checking to see if all hosts have failed 7557 1726882093.52370: getting the remaining hosts for this loop 7557 1726882093.52371: done getting the remaining hosts for this loop 7557 1726882093.52374: getting the next task for host managed_node3 7557 1726882093.52379: done getting next task for host managed_node3 7557 1726882093.52381: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }} 7557 1726882093.52387: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7557 1726882093.52390: getting variables 7557 1726882093.52391: in VariableManager get_vars() 7557 1726882093.52435: Calling all_inventory to load vars for managed_node3 7557 1726882093.52438: Calling groups_inventory to load vars for managed_node3 7557 1726882093.52440: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882093.52448: Calling all_plugins_play to load vars for managed_node3 7557 1726882093.52450: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882093.52453: Calling groups_plugins_play to load vars for managed_node3 7557 1726882093.53274: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882093.54124: done with get_vars() 7557 1726882093.54139: done getting variables 7557 1726882093.54179: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 7557 1726882093.54256: variable 'profile' from source: include params 7557 1726882093.54260: variable 'interface' from source: play vars 7557 1726882093.54301: variable 'interface' from source: play vars TASK [Verify the ansible_managed comment in ifcfg-veth0] *********************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56 Friday 20 September 2024 21:28:13 -0400 (0:00:00.031) 0:00:19.396 ****** 7557 1726882093.54323: entering _queue_task() for managed_node3/set_fact 7557 1726882093.54546: worker is 1 (out of 1 available) 7557 1726882093.54561: exiting _queue_task() for managed_node3/set_fact 7557 1726882093.54574: done queuing things up, now waiting for results queue to drain 7557 1726882093.54575: waiting for pending results... 
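The "Get the ansible_managed comment" task just skipped, and the "Verify" task queued next, are both guarded by profile_stat.stat.exists, so an earlier stat task must have registered profile_stat and found no ifcfg-style file (the profile lives as a keyfile under /etc/NetworkManager/system-connections according to the nmcli output above). A rough sketch of that guard pattern follows; the ifcfg path, grep pattern and register name are assumptions, since the skipped command itself never appears in the log:

  - name: Get file stat for ifcfg-veth0
    ansible.builtin.stat:
      path: /etc/sysconfig/network-scripts/ifcfg-veth0   # assumed location of the ifcfg file
    register: profile_stat

  - name: Get the ansible_managed comment in ifcfg-veth0
    ansible.builtin.command: grep ansible_managed /etc/sysconfig/network-scripts/ifcfg-veth0   # hypothetical command
    register: ifcfg_ansible_managed   # hypothetical register name
    when: profile_stat.stat.exists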
7557 1726882093.54744: running TaskExecutor() for managed_node3/TASK: Verify the ansible_managed comment in ifcfg-veth0 7557 1726882093.54813: in run() - task 12673a56-9f93-ed48-b3a5-000000000cf6 7557 1726882093.54825: variable 'ansible_search_path' from source: unknown 7557 1726882093.54829: variable 'ansible_search_path' from source: unknown 7557 1726882093.54856: calling self._execute() 7557 1726882093.54934: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882093.54937: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882093.54946: variable 'omit' from source: magic vars 7557 1726882093.55209: variable 'ansible_distribution_major_version' from source: facts 7557 1726882093.55218: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882093.55302: variable 'profile_stat' from source: set_fact 7557 1726882093.55312: Evaluated conditional (profile_stat.stat.exists): False 7557 1726882093.55316: when evaluation is False, skipping this task 7557 1726882093.55318: _execute() done 7557 1726882093.55321: dumping result to json 7557 1726882093.55324: done dumping result, returning 7557 1726882093.55330: done running TaskExecutor() for managed_node3/TASK: Verify the ansible_managed comment in ifcfg-veth0 [12673a56-9f93-ed48-b3a5-000000000cf6] 7557 1726882093.55335: sending task result for task 12673a56-9f93-ed48-b3a5-000000000cf6 7557 1726882093.55421: done sending task result for task 12673a56-9f93-ed48-b3a5-000000000cf6 7557 1726882093.55423: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 7557 1726882093.55491: no more pending results, returning what we have 7557 1726882093.55498: results queue empty 7557 1726882093.55499: checking for any_errors_fatal 7557 1726882093.55503: done checking for any_errors_fatal 7557 1726882093.55504: checking for max_fail_percentage 7557 1726882093.55506: done checking for max_fail_percentage 7557 1726882093.55506: checking to see if all hosts have failed and the running result is not ok 7557 1726882093.55507: done checking to see if all hosts have failed 7557 1726882093.55508: getting the remaining hosts for this loop 7557 1726882093.55509: done getting the remaining hosts for this loop 7557 1726882093.55512: getting the next task for host managed_node3 7557 1726882093.55519: done getting next task for host managed_node3 7557 1726882093.55521: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }} 7557 1726882093.55524: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7557 1726882093.55528: getting variables 7557 1726882093.55529: in VariableManager get_vars() 7557 1726882093.55568: Calling all_inventory to load vars for managed_node3 7557 1726882093.55570: Calling groups_inventory to load vars for managed_node3 7557 1726882093.55572: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882093.55582: Calling all_plugins_play to load vars for managed_node3 7557 1726882093.55585: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882093.55587: Calling groups_plugins_play to load vars for managed_node3 7557 1726882093.56348: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882093.57283: done with get_vars() 7557 1726882093.57303: done getting variables 7557 1726882093.57345: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 7557 1726882093.57428: variable 'profile' from source: include params 7557 1726882093.57431: variable 'interface' from source: play vars 7557 1726882093.57471: variable 'interface' from source: play vars TASK [Get the fingerprint comment in ifcfg-veth0] ****************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62 Friday 20 September 2024 21:28:13 -0400 (0:00:00.031) 0:00:19.427 ****** 7557 1726882093.57498: entering _queue_task() for managed_node3/command 7557 1726882093.57732: worker is 1 (out of 1 available) 7557 1726882093.57744: exiting _queue_task() for managed_node3/command 7557 1726882093.57758: done queuing things up, now waiting for results queue to drain 7557 1726882093.57759: waiting for pending results... 
7557 1726882093.57932: running TaskExecutor() for managed_node3/TASK: Get the fingerprint comment in ifcfg-veth0 7557 1726882093.58010: in run() - task 12673a56-9f93-ed48-b3a5-000000000cf7 7557 1726882093.58022: variable 'ansible_search_path' from source: unknown 7557 1726882093.58027: variable 'ansible_search_path' from source: unknown 7557 1726882093.58054: calling self._execute() 7557 1726882093.58134: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882093.58138: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882093.58146: variable 'omit' from source: magic vars 7557 1726882093.58407: variable 'ansible_distribution_major_version' from source: facts 7557 1726882093.58417: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882093.58500: variable 'profile_stat' from source: set_fact 7557 1726882093.58511: Evaluated conditional (profile_stat.stat.exists): False 7557 1726882093.58514: when evaluation is False, skipping this task 7557 1726882093.58516: _execute() done 7557 1726882093.58518: dumping result to json 7557 1726882093.58523: done dumping result, returning 7557 1726882093.58528: done running TaskExecutor() for managed_node3/TASK: Get the fingerprint comment in ifcfg-veth0 [12673a56-9f93-ed48-b3a5-000000000cf7] 7557 1726882093.58532: sending task result for task 12673a56-9f93-ed48-b3a5-000000000cf7 7557 1726882093.58615: done sending task result for task 12673a56-9f93-ed48-b3a5-000000000cf7 7557 1726882093.58617: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 7557 1726882093.58677: no more pending results, returning what we have 7557 1726882093.58682: results queue empty 7557 1726882093.58683: checking for any_errors_fatal 7557 1726882093.58690: done checking for any_errors_fatal 7557 1726882093.58691: checking for max_fail_percentage 7557 1726882093.58692: done checking for max_fail_percentage 7557 1726882093.58696: checking to see if all hosts have failed and the running result is not ok 7557 1726882093.58697: done checking to see if all hosts have failed 7557 1726882093.58698: getting the remaining hosts for this loop 7557 1726882093.58699: done getting the remaining hosts for this loop 7557 1726882093.58702: getting the next task for host managed_node3 7557 1726882093.58708: done getting next task for host managed_node3 7557 1726882093.58710: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }} 7557 1726882093.58713: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7557 1726882093.58716: getting variables 7557 1726882093.58718: in VariableManager get_vars() 7557 1726882093.58759: Calling all_inventory to load vars for managed_node3 7557 1726882093.58761: Calling groups_inventory to load vars for managed_node3 7557 1726882093.58764: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882093.58773: Calling all_plugins_play to load vars for managed_node3 7557 1726882093.58775: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882093.58777: Calling groups_plugins_play to load vars for managed_node3 7557 1726882093.59541: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882093.60401: done with get_vars() 7557 1726882093.60419: done getting variables 7557 1726882093.60463: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 7557 1726882093.60548: variable 'profile' from source: include params 7557 1726882093.60551: variable 'interface' from source: play vars 7557 1726882093.60589: variable 'interface' from source: play vars TASK [Verify the fingerprint comment in ifcfg-veth0] *************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69 Friday 20 September 2024 21:28:13 -0400 (0:00:00.031) 0:00:19.459 ****** 7557 1726882093.60616: entering _queue_task() for managed_node3/set_fact 7557 1726882093.60852: worker is 1 (out of 1 available) 7557 1726882093.60866: exiting _queue_task() for managed_node3/set_fact 7557 1726882093.60881: done queuing things up, now waiting for results queue to drain 7557 1726882093.60882: waiting for pending results... 
7557 1726882093.61055: running TaskExecutor() for managed_node3/TASK: Verify the fingerprint comment in ifcfg-veth0 7557 1726882093.61128: in run() - task 12673a56-9f93-ed48-b3a5-000000000cf8 7557 1726882093.61138: variable 'ansible_search_path' from source: unknown 7557 1726882093.61142: variable 'ansible_search_path' from source: unknown 7557 1726882093.61169: calling self._execute() 7557 1726882093.61250: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882093.61255: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882093.61264: variable 'omit' from source: magic vars 7557 1726882093.61513: variable 'ansible_distribution_major_version' from source: facts 7557 1726882093.61523: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882093.61608: variable 'profile_stat' from source: set_fact 7557 1726882093.61619: Evaluated conditional (profile_stat.stat.exists): False 7557 1726882093.61623: when evaluation is False, skipping this task 7557 1726882093.61626: _execute() done 7557 1726882093.61628: dumping result to json 7557 1726882093.61630: done dumping result, returning 7557 1726882093.61636: done running TaskExecutor() for managed_node3/TASK: Verify the fingerprint comment in ifcfg-veth0 [12673a56-9f93-ed48-b3a5-000000000cf8] 7557 1726882093.61641: sending task result for task 12673a56-9f93-ed48-b3a5-000000000cf8 7557 1726882093.61724: done sending task result for task 12673a56-9f93-ed48-b3a5-000000000cf8 7557 1726882093.61727: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 7557 1726882093.61802: no more pending results, returning what we have 7557 1726882093.61805: results queue empty 7557 1726882093.61806: checking for any_errors_fatal 7557 1726882093.61810: done checking for any_errors_fatal 7557 1726882093.61811: checking for max_fail_percentage 7557 1726882093.61813: done checking for max_fail_percentage 7557 1726882093.61813: checking to see if all hosts have failed and the running result is not ok 7557 1726882093.61814: done checking to see if all hosts have failed 7557 1726882093.61815: getting the remaining hosts for this loop 7557 1726882093.61816: done getting the remaining hosts for this loop 7557 1726882093.61819: getting the next task for host managed_node3 7557 1726882093.61827: done getting next task for host managed_node3 7557 1726882093.61829: ^ task is: TASK: Assert that the profile is present - '{{ profile }}' 7557 1726882093.61831: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7557 1726882093.61835: getting variables 7557 1726882093.61836: in VariableManager get_vars() 7557 1726882093.61877: Calling all_inventory to load vars for managed_node3 7557 1726882093.61879: Calling groups_inventory to load vars for managed_node3 7557 1726882093.61881: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882093.61890: Calling all_plugins_play to load vars for managed_node3 7557 1726882093.61896: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882093.61899: Calling groups_plugins_play to load vars for managed_node3 7557 1726882093.62749: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882093.63583: done with get_vars() 7557 1726882093.63602: done getting variables 7557 1726882093.63643: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 7557 1726882093.63721: variable 'profile' from source: include params 7557 1726882093.63724: variable 'interface' from source: play vars 7557 1726882093.63761: variable 'interface' from source: play vars TASK [Assert that the profile is present - 'veth0'] **************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:5 Friday 20 September 2024 21:28:13 -0400 (0:00:00.031) 0:00:19.490 ****** 7557 1726882093.63784: entering _queue_task() for managed_node3/assert 7557 1726882093.64004: worker is 1 (out of 1 available) 7557 1726882093.64017: exiting _queue_task() for managed_node3/assert 7557 1726882093.64031: done queuing things up, now waiting for results queue to drain 7557 1726882093.64033: waiting for pending results... 
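The assert queued here only needs to check the lsr_net_profile_exists fact set earlier from the nmcli result; a minimal sketch of such an assert task (the fail_msg wording is invented for illustration):

  - name: Assert that the profile is present - '{{ profile }}'
    ansible.builtin.assert:
      that:
        - lsr_net_profile_exists
      fail_msg: profile '{{ profile }}' is not present   # hypothetical message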
7557 1726882093.64205: running TaskExecutor() for managed_node3/TASK: Assert that the profile is present - 'veth0' 7557 1726882093.64280: in run() - task 12673a56-9f93-ed48-b3a5-000000000adf 7557 1726882093.64291: variable 'ansible_search_path' from source: unknown 7557 1726882093.64296: variable 'ansible_search_path' from source: unknown 7557 1726882093.64330: calling self._execute() 7557 1726882093.64411: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882093.64415: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882093.64425: variable 'omit' from source: magic vars 7557 1726882093.64696: variable 'ansible_distribution_major_version' from source: facts 7557 1726882093.64708: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882093.64714: variable 'omit' from source: magic vars 7557 1726882093.64743: variable 'omit' from source: magic vars 7557 1726882093.64816: variable 'profile' from source: include params 7557 1726882093.64820: variable 'interface' from source: play vars 7557 1726882093.64867: variable 'interface' from source: play vars 7557 1726882093.64883: variable 'omit' from source: magic vars 7557 1726882093.64918: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7557 1726882093.64944: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7557 1726882093.64959: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7557 1726882093.64972: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7557 1726882093.64984: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7557 1726882093.65012: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7557 1726882093.65016: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882093.65018: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882093.65087: Set connection var ansible_module_compression to ZIP_DEFLATED 7557 1726882093.65101: Set connection var ansible_shell_executable to /bin/sh 7557 1726882093.65104: Set connection var ansible_shell_type to sh 7557 1726882093.65106: Set connection var ansible_pipelining to False 7557 1726882093.65109: Set connection var ansible_connection to ssh 7557 1726882093.65114: Set connection var ansible_timeout to 10 7557 1726882093.65132: variable 'ansible_shell_executable' from source: unknown 7557 1726882093.65135: variable 'ansible_connection' from source: unknown 7557 1726882093.65138: variable 'ansible_module_compression' from source: unknown 7557 1726882093.65140: variable 'ansible_shell_type' from source: unknown 7557 1726882093.65143: variable 'ansible_shell_executable' from source: unknown 7557 1726882093.65145: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882093.65147: variable 'ansible_pipelining' from source: unknown 7557 1726882093.65150: variable 'ansible_timeout' from source: unknown 7557 1726882093.65152: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882093.65253: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 7557 1726882093.65261: variable 'omit' from source: magic vars 7557 1726882093.65266: starting attempt loop 7557 1726882093.65268: running the handler 7557 1726882093.65347: variable 'lsr_net_profile_exists' from source: set_fact 7557 1726882093.65351: Evaluated conditional (lsr_net_profile_exists): True 7557 1726882093.65356: handler run complete 7557 1726882093.65367: attempt loop complete, returning result 7557 1726882093.65369: _execute() done 7557 1726882093.65372: dumping result to json 7557 1726882093.65374: done dumping result, returning 7557 1726882093.65381: done running TaskExecutor() for managed_node3/TASK: Assert that the profile is present - 'veth0' [12673a56-9f93-ed48-b3a5-000000000adf] 7557 1726882093.65385: sending task result for task 12673a56-9f93-ed48-b3a5-000000000adf 7557 1726882093.65462: done sending task result for task 12673a56-9f93-ed48-b3a5-000000000adf 7557 1726882093.65465: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false } MSG: All assertions passed 7557 1726882093.65517: no more pending results, returning what we have 7557 1726882093.65520: results queue empty 7557 1726882093.65521: checking for any_errors_fatal 7557 1726882093.65527: done checking for any_errors_fatal 7557 1726882093.65528: checking for max_fail_percentage 7557 1726882093.65529: done checking for max_fail_percentage 7557 1726882093.65530: checking to see if all hosts have failed and the running result is not ok 7557 1726882093.65531: done checking to see if all hosts have failed 7557 1726882093.65531: getting the remaining hosts for this loop 7557 1726882093.65533: done getting the remaining hosts for this loop 7557 1726882093.65536: getting the next task for host managed_node3 7557 1726882093.65541: done getting next task for host managed_node3 7557 1726882093.65544: ^ task is: TASK: Assert that the ansible managed comment is present in '{{ profile }}' 7557 1726882093.65546: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7557 1726882093.65550: getting variables 7557 1726882093.65551: in VariableManager get_vars() 7557 1726882093.65605: Calling all_inventory to load vars for managed_node3 7557 1726882093.65608: Calling groups_inventory to load vars for managed_node3 7557 1726882093.65610: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882093.65621: Calling all_plugins_play to load vars for managed_node3 7557 1726882093.65623: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882093.65625: Calling groups_plugins_play to load vars for managed_node3 7557 1726882093.66385: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882093.67227: done with get_vars() 7557 1726882093.67242: done getting variables 7557 1726882093.67281: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 7557 1726882093.67359: variable 'profile' from source: include params 7557 1726882093.67362: variable 'interface' from source: play vars 7557 1726882093.67402: variable 'interface' from source: play vars TASK [Assert that the ansible managed comment is present in 'veth0'] *********** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:10 Friday 20 September 2024 21:28:13 -0400 (0:00:00.036) 0:00:19.527 ****** 7557 1726882093.67430: entering _queue_task() for managed_node3/assert 7557 1726882093.67639: worker is 1 (out of 1 available) 7557 1726882093.67653: exiting _queue_task() for managed_node3/assert 7557 1726882093.67665: done queuing things up, now waiting for results queue to drain 7557 1726882093.67667: waiting for pending results... 
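Throughout these assertions the log resolves profile from include params and interface from play vars, which implies assert_profile_present.yml is included with profile passed in explicitly; one plausible, unverified shape for that include is:

  - name: Run the profile assertions for the test interface
    ansible.builtin.include_tasks: tasks/assert_profile_present.yml   # path relative to the test playbook; assumed
    vars:
      profile: "{{ interface }}"   # interface itself comes from play vars (veth0 in this run)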
7557 1726882093.67841: running TaskExecutor() for managed_node3/TASK: Assert that the ansible managed comment is present in 'veth0' 7557 1726882093.67904: in run() - task 12673a56-9f93-ed48-b3a5-000000000ae0 7557 1726882093.67915: variable 'ansible_search_path' from source: unknown 7557 1726882093.67919: variable 'ansible_search_path' from source: unknown 7557 1726882093.67946: calling self._execute() 7557 1726882093.68028: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882093.68032: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882093.68041: variable 'omit' from source: magic vars 7557 1726882093.68304: variable 'ansible_distribution_major_version' from source: facts 7557 1726882093.68313: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882093.68321: variable 'omit' from source: magic vars 7557 1726882093.68351: variable 'omit' from source: magic vars 7557 1726882093.68419: variable 'profile' from source: include params 7557 1726882093.68423: variable 'interface' from source: play vars 7557 1726882093.68471: variable 'interface' from source: play vars 7557 1726882093.68484: variable 'omit' from source: magic vars 7557 1726882093.68526: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7557 1726882093.68554: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7557 1726882093.68570: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7557 1726882093.68583: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7557 1726882093.68595: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7557 1726882093.68620: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7557 1726882093.68623: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882093.68626: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882093.68701: Set connection var ansible_module_compression to ZIP_DEFLATED 7557 1726882093.68708: Set connection var ansible_shell_executable to /bin/sh 7557 1726882093.68711: Set connection var ansible_shell_type to sh 7557 1726882093.68716: Set connection var ansible_pipelining to False 7557 1726882093.68719: Set connection var ansible_connection to ssh 7557 1726882093.68724: Set connection var ansible_timeout to 10 7557 1726882093.68740: variable 'ansible_shell_executable' from source: unknown 7557 1726882093.68743: variable 'ansible_connection' from source: unknown 7557 1726882093.68746: variable 'ansible_module_compression' from source: unknown 7557 1726882093.68748: variable 'ansible_shell_type' from source: unknown 7557 1726882093.68750: variable 'ansible_shell_executable' from source: unknown 7557 1726882093.68753: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882093.68755: variable 'ansible_pipelining' from source: unknown 7557 1726882093.68760: variable 'ansible_timeout' from source: unknown 7557 1726882093.68762: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882093.68862: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 7557 1726882093.68871: variable 'omit' from source: magic vars 7557 1726882093.68875: starting attempt loop 7557 1726882093.68877: running the handler 7557 1726882093.68954: variable 'lsr_net_profile_ansible_managed' from source: set_fact 7557 1726882093.68957: Evaluated conditional (lsr_net_profile_ansible_managed): True 7557 1726882093.68963: handler run complete 7557 1726882093.68974: attempt loop complete, returning result 7557 1726882093.68976: _execute() done 7557 1726882093.68979: dumping result to json 7557 1726882093.68981: done dumping result, returning 7557 1726882093.68990: done running TaskExecutor() for managed_node3/TASK: Assert that the ansible managed comment is present in 'veth0' [12673a56-9f93-ed48-b3a5-000000000ae0] 7557 1726882093.68992: sending task result for task 12673a56-9f93-ed48-b3a5-000000000ae0 7557 1726882093.69070: done sending task result for task 12673a56-9f93-ed48-b3a5-000000000ae0 7557 1726882093.69072: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false } MSG: All assertions passed 7557 1726882093.69150: no more pending results, returning what we have 7557 1726882093.69153: results queue empty 7557 1726882093.69154: checking for any_errors_fatal 7557 1726882093.69157: done checking for any_errors_fatal 7557 1726882093.69158: checking for max_fail_percentage 7557 1726882093.69160: done checking for max_fail_percentage 7557 1726882093.69160: checking to see if all hosts have failed and the running result is not ok 7557 1726882093.69161: done checking to see if all hosts have failed 7557 1726882093.69162: getting the remaining hosts for this loop 7557 1726882093.69163: done getting the remaining hosts for this loop 7557 1726882093.69166: getting the next task for host managed_node3 7557 1726882093.69172: done getting next task for host managed_node3 7557 1726882093.69174: ^ task is: TASK: Assert that the fingerprint comment is present in {{ profile }} 7557 1726882093.69177: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7557 1726882093.69180: getting variables 7557 1726882093.69181: in VariableManager get_vars() 7557 1726882093.69224: Calling all_inventory to load vars for managed_node3 7557 1726882093.69227: Calling groups_inventory to load vars for managed_node3 7557 1726882093.69229: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882093.69238: Calling all_plugins_play to load vars for managed_node3 7557 1726882093.69240: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882093.69242: Calling groups_plugins_play to load vars for managed_node3 7557 1726882093.73637: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882093.74473: done with get_vars() 7557 1726882093.74496: done getting variables 7557 1726882093.74535: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 7557 1726882093.74605: variable 'profile' from source: include params 7557 1726882093.74608: variable 'interface' from source: play vars 7557 1726882093.74650: variable 'interface' from source: play vars TASK [Assert that the fingerprint comment is present in veth0] ***************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:15 Friday 20 September 2024 21:28:13 -0400 (0:00:00.072) 0:00:19.599 ****** 7557 1726882093.74675: entering _queue_task() for managed_node3/assert 7557 1726882093.74927: worker is 1 (out of 1 available) 7557 1726882093.74940: exiting _queue_task() for managed_node3/assert 7557 1726882093.74955: done queuing things up, now waiting for results queue to drain 7557 1726882093.74957: waiting for pending results... 
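Every task execution above records the same effective connection settings (ssh connection, pipelining off, 10 second timeout, plain sh shell, ZIP_DEFLATED module compression), most of them sourced from defaults rather than inventory; written out explicitly as host variables they would look roughly like:

  ansible_connection: ssh
  ansible_pipelining: false
  ansible_timeout: 10
  ansible_shell_type: sh
  ansible_shell_executable: /bin/sh
  ansible_module_compression: ZIP_DEFLATED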
7557 1726882093.75137: running TaskExecutor() for managed_node3/TASK: Assert that the fingerprint comment is present in veth0 7557 1726882093.75222: in run() - task 12673a56-9f93-ed48-b3a5-000000000ae1 7557 1726882093.75231: variable 'ansible_search_path' from source: unknown 7557 1726882093.75234: variable 'ansible_search_path' from source: unknown 7557 1726882093.75264: calling self._execute() 7557 1726882093.75345: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882093.75350: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882093.75359: variable 'omit' from source: magic vars 7557 1726882093.75638: variable 'ansible_distribution_major_version' from source: facts 7557 1726882093.75648: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882093.75654: variable 'omit' from source: magic vars 7557 1726882093.75684: variable 'omit' from source: magic vars 7557 1726882093.75759: variable 'profile' from source: include params 7557 1726882093.75763: variable 'interface' from source: play vars 7557 1726882093.75810: variable 'interface' from source: play vars 7557 1726882093.75826: variable 'omit' from source: magic vars 7557 1726882093.75862: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7557 1726882093.75888: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7557 1726882093.75908: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7557 1726882093.75922: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7557 1726882093.75931: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7557 1726882093.75959: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7557 1726882093.75962: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882093.75965: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882093.76037: Set connection var ansible_module_compression to ZIP_DEFLATED 7557 1726882093.76043: Set connection var ansible_shell_executable to /bin/sh 7557 1726882093.76047: Set connection var ansible_shell_type to sh 7557 1726882093.76050: Set connection var ansible_pipelining to False 7557 1726882093.76054: Set connection var ansible_connection to ssh 7557 1726882093.76057: Set connection var ansible_timeout to 10 7557 1726882093.76076: variable 'ansible_shell_executable' from source: unknown 7557 1726882093.76080: variable 'ansible_connection' from source: unknown 7557 1726882093.76083: variable 'ansible_module_compression' from source: unknown 7557 1726882093.76085: variable 'ansible_shell_type' from source: unknown 7557 1726882093.76087: variable 'ansible_shell_executable' from source: unknown 7557 1726882093.76090: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882093.76092: variable 'ansible_pipelining' from source: unknown 7557 1726882093.76096: variable 'ansible_timeout' from source: unknown 7557 1726882093.76102: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882093.76204: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 7557 1726882093.76213: variable 'omit' from source: magic vars 7557 1726882093.76217: starting attempt loop 7557 1726882093.76220: running the handler 7557 1726882093.76298: variable 'lsr_net_profile_fingerprint' from source: set_fact 7557 1726882093.76305: Evaluated conditional (lsr_net_profile_fingerprint): True 7557 1726882093.76310: handler run complete 7557 1726882093.76321: attempt loop complete, returning result 7557 1726882093.76324: _execute() done 7557 1726882093.76327: dumping result to json 7557 1726882093.76329: done dumping result, returning 7557 1726882093.76335: done running TaskExecutor() for managed_node3/TASK: Assert that the fingerprint comment is present in veth0 [12673a56-9f93-ed48-b3a5-000000000ae1] 7557 1726882093.76340: sending task result for task 12673a56-9f93-ed48-b3a5-000000000ae1 ok: [managed_node3] => { "changed": false } MSG: All assertions passed 7557 1726882093.76469: no more pending results, returning what we have 7557 1726882093.76472: results queue empty 7557 1726882093.76473: checking for any_errors_fatal 7557 1726882093.76481: done checking for any_errors_fatal 7557 1726882093.76481: checking for max_fail_percentage 7557 1726882093.76483: done checking for max_fail_percentage 7557 1726882093.76484: checking to see if all hosts have failed and the running result is not ok 7557 1726882093.76484: done checking to see if all hosts have failed 7557 1726882093.76485: getting the remaining hosts for this loop 7557 1726882093.76487: done getting the remaining hosts for this loop 7557 1726882093.76489: getting the next task for host managed_node3 7557 1726882093.76498: done getting next task for host managed_node3 7557 1726882093.76503: ^ task is: TASK: Show ipv4 routes 7557 1726882093.76505: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7557 1726882093.76508: getting variables 7557 1726882093.76510: in VariableManager get_vars() 7557 1726882093.76559: Calling all_inventory to load vars for managed_node3 7557 1726882093.76562: Calling groups_inventory to load vars for managed_node3 7557 1726882093.76564: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882093.76574: Calling all_plugins_play to load vars for managed_node3 7557 1726882093.76577: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882093.76579: Calling groups_plugins_play to load vars for managed_node3 7557 1726882093.77107: done sending task result for task 12673a56-9f93-ed48-b3a5-000000000ae1 7557 1726882093.77111: WORKER PROCESS EXITING 7557 1726882093.77398: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882093.78260: done with get_vars() 7557 1726882093.78276: done getting variables 7557 1726882093.78320: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Show ipv4 routes] ******************************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_auto_gateway.yml:48 Friday 20 September 2024 21:28:13 -0400 (0:00:00.036) 0:00:19.636 ****** 7557 1726882093.78344: entering _queue_task() for managed_node3/command 7557 1726882093.78569: worker is 1 (out of 1 available) 7557 1726882093.78583: exiting _queue_task() for managed_node3/command 7557 1726882093.78598: done queuing things up, now waiting for results queue to drain 7557 1726882093.78600: waiting for pending results... 
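The upcoming "Show ipv4 routes" task (tests_auto_gateway.yml:48) also uses the command action module; its exact command is not visible at this point in the log, so the invocation below is only a guess at a typical routes check, and the lines that follow merely show the usual low-level machinery (discovering the remote home directory, creating a temp directory, and transferring AnsiballZ_command.py over SSH) before the command runs:

  - name: Show ipv4 routes
    ansible.builtin.command: ip route   # hypothetical command, not confirmed by this excerpt
    register: route_out                 # hypothetical register name

  - name: Display the routes
    ansible.builtin.debug:
      var: route_out.stdout_lines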
7557 1726882093.78774: running TaskExecutor() for managed_node3/TASK: Show ipv4 routes 7557 1726882093.78844: in run() - task 12673a56-9f93-ed48-b3a5-00000000005d 7557 1726882093.78856: variable 'ansible_search_path' from source: unknown 7557 1726882093.78885: calling self._execute() 7557 1726882093.78967: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882093.78972: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882093.78980: variable 'omit' from source: magic vars 7557 1726882093.79252: variable 'ansible_distribution_major_version' from source: facts 7557 1726882093.79262: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882093.79273: variable 'omit' from source: magic vars 7557 1726882093.79289: variable 'omit' from source: magic vars 7557 1726882093.79320: variable 'omit' from source: magic vars 7557 1726882093.79351: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7557 1726882093.79382: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7557 1726882093.79402: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7557 1726882093.79416: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7557 1726882093.79427: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7557 1726882093.79451: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7557 1726882093.79455: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882093.79457: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882093.79533: Set connection var ansible_module_compression to ZIP_DEFLATED 7557 1726882093.79539: Set connection var ansible_shell_executable to /bin/sh 7557 1726882093.79542: Set connection var ansible_shell_type to sh 7557 1726882093.79546: Set connection var ansible_pipelining to False 7557 1726882093.79549: Set connection var ansible_connection to ssh 7557 1726882093.79554: Set connection var ansible_timeout to 10 7557 1726882093.79570: variable 'ansible_shell_executable' from source: unknown 7557 1726882093.79573: variable 'ansible_connection' from source: unknown 7557 1726882093.79575: variable 'ansible_module_compression' from source: unknown 7557 1726882093.79578: variable 'ansible_shell_type' from source: unknown 7557 1726882093.79580: variable 'ansible_shell_executable' from source: unknown 7557 1726882093.79585: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882093.79587: variable 'ansible_pipelining' from source: unknown 7557 1726882093.79589: variable 'ansible_timeout' from source: unknown 7557 1726882093.79591: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882093.79692: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 7557 1726882093.79712: variable 'omit' from source: magic vars 7557 1726882093.79715: starting attempt loop 7557 1726882093.79717: running the handler 7557 1726882093.79726: 
_low_level_execute_command(): starting 7557 1726882093.79734: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7557 1726882093.80249: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882093.80253: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882093.80256: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882093.80259: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882093.80317: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882093.80320: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882093.80323: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882093.80377: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882093.82119: stdout chunk (state=3): >>>/root <<< 7557 1726882093.82215: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882093.82245: stderr chunk (state=3): >>><<< 7557 1726882093.82248: stdout chunk (state=3): >>><<< 7557 1726882093.82269: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7557 1726882093.82282: _low_level_execute_command(): starting 7557 1726882093.82289: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882093.8227005-8368-274926461391526 `" && echo 
ansible-tmp-1726882093.8227005-8368-274926461391526="` echo /root/.ansible/tmp/ansible-tmp-1726882093.8227005-8368-274926461391526 `" ) && sleep 0' 7557 1726882093.82750: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7557 1726882093.82755: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found <<< 7557 1726882093.82757: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882093.82768: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882093.82771: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882093.82821: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882093.82825: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882093.82831: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882093.82876: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882093.84768: stdout chunk (state=3): >>>ansible-tmp-1726882093.8227005-8368-274926461391526=/root/.ansible/tmp/ansible-tmp-1726882093.8227005-8368-274926461391526 <<< 7557 1726882093.84872: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882093.84903: stderr chunk (state=3): >>><<< 7557 1726882093.84906: stdout chunk (state=3): >>><<< 7557 1726882093.84928: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882093.8227005-8368-274926461391526=/root/.ansible/tmp/ansible-tmp-1726882093.8227005-8368-274926461391526 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7557 1726882093.84952: variable 
'ansible_module_compression' from source: unknown 7557 1726882093.84995: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-7557ap94rh2e/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 7557 1726882093.85036: variable 'ansible_facts' from source: unknown 7557 1726882093.85086: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882093.8227005-8368-274926461391526/AnsiballZ_command.py 7557 1726882093.85198: Sending initial data 7557 1726882093.85202: Sent initial data (154 bytes) 7557 1726882093.85652: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882093.85655: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found <<< 7557 1726882093.85657: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882093.85659: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882093.85661: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882093.85717: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882093.85720: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882093.85726: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882093.85771: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882093.87288: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 7557 1726882093.87295: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 7557 1726882093.87332: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 7557 1726882093.87380: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-7557ap94rh2e/tmppzco5kal /root/.ansible/tmp/ansible-tmp-1726882093.8227005-8368-274926461391526/AnsiballZ_command.py <<< 7557 1726882093.87383: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882093.8227005-8368-274926461391526/AnsiballZ_command.py" <<< 7557 1726882093.87427: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-7557ap94rh2e/tmppzco5kal" to remote "/root/.ansible/tmp/ansible-tmp-1726882093.8227005-8368-274926461391526/AnsiballZ_command.py" <<< 7557 1726882093.87430: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882093.8227005-8368-274926461391526/AnsiballZ_command.py" <<< 7557 1726882093.87974: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882093.88020: stderr chunk (state=3): >>><<< 7557 1726882093.88023: stdout chunk (state=3): >>><<< 7557 1726882093.88066: done transferring module to remote 7557 1726882093.88077: _low_level_execute_command(): starting 7557 1726882093.88081: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882093.8227005-8368-274926461391526/ /root/.ansible/tmp/ansible-tmp-1726882093.8227005-8368-274926461391526/AnsiballZ_command.py && sleep 0' 7557 1726882093.88540: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882093.88543: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found <<< 7557 1726882093.88546: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882093.88548: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882093.88554: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882093.88601: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882093.88605: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882093.88616: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882093.88660: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882093.90377: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882093.90400: stderr chunk (state=3): >>><<< 7557 1726882093.90403: stdout chunk (state=3): >>><<< 7557 1726882093.90415: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7557 1726882093.90418: _low_level_execute_command(): starting 7557 1726882093.90424: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882093.8227005-8368-274926461391526/AnsiballZ_command.py && sleep 0' 7557 1726882093.90879: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7557 1726882093.90882: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found <<< 7557 1726882093.90885: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 7557 1726882093.90887: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882093.90889: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882093.90944: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882093.90950: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882093.90953: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882093.91001: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882094.06306: stdout chunk (state=3): >>> {"changed": true, "stdout": "default via 10.31.8.1 dev eth0 proto dhcp src 10.31.10.229 metric 100 \ndefault via 203.0.113.1 dev veth0 proto static metric 65535 \n10.31.8.0/22 dev eth0 proto kernel scope link src 10.31.10.229 metric 100 \n203.0.113.0/24 dev veth0 proto kernel scope link src 203.0.113.2 metric 65535 ", "stderr": "", "rc": 0, "cmd": ["ip", "route"], "start": "2024-09-20 21:28:14.056976", "end": "2024-09-20 21:28:14.060436", "delta": "0:00:00.003460", "msg": "", "invocation": {"module_args": {"_raw_params": "ip route", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": 
true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 7557 1726882094.07649: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. <<< 7557 1726882094.07660: stdout chunk (state=3): >>><<< 7557 1726882094.07671: stderr chunk (state=3): >>><<< 7557 1726882094.07701: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "default via 10.31.8.1 dev eth0 proto dhcp src 10.31.10.229 metric 100 \ndefault via 203.0.113.1 dev veth0 proto static metric 65535 \n10.31.8.0/22 dev eth0 proto kernel scope link src 10.31.10.229 metric 100 \n203.0.113.0/24 dev veth0 proto kernel scope link src 203.0.113.2 metric 65535 ", "stderr": "", "rc": 0, "cmd": ["ip", "route"], "start": "2024-09-20 21:28:14.056976", "end": "2024-09-20 21:28:14.060436", "delta": "0:00:00.003460", "msg": "", "invocation": {"module_args": {"_raw_params": "ip route", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. 
7557 1726882094.07744: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip route', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882093.8227005-8368-274926461391526/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7557 1726882094.07832: _low_level_execute_command(): starting 7557 1726882094.07837: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882093.8227005-8368-274926461391526/ > /dev/null 2>&1 && sleep 0' 7557 1726882094.08383: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7557 1726882094.08404: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7557 1726882094.08418: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882094.08437: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7557 1726882094.08455: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 <<< 7557 1726882094.08552: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882094.08569: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882094.08582: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882094.08658: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882094.10600: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882094.10604: stderr chunk (state=3): >>><<< 7557 1726882094.10606: stdout chunk (state=3): >>><<< 7557 1726882094.10609: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7557 1726882094.10611: handler run complete 7557 1726882094.10613: Evaluated conditional (False): False 7557 1726882094.10615: attempt loop complete, returning result 7557 1726882094.10617: _execute() done 7557 1726882094.10619: dumping result to json 7557 1726882094.10621: done dumping result, returning 7557 1726882094.10623: done running TaskExecutor() for managed_node3/TASK: Show ipv4 routes [12673a56-9f93-ed48-b3a5-00000000005d] 7557 1726882094.10625: sending task result for task 12673a56-9f93-ed48-b3a5-00000000005d ok: [managed_node3] => { "changed": false, "cmd": [ "ip", "route" ], "delta": "0:00:00.003460", "end": "2024-09-20 21:28:14.060436", "rc": 0, "start": "2024-09-20 21:28:14.056976" } STDOUT: default via 10.31.8.1 dev eth0 proto dhcp src 10.31.10.229 metric 100 default via 203.0.113.1 dev veth0 proto static metric 65535 10.31.8.0/22 dev eth0 proto kernel scope link src 10.31.10.229 metric 100 203.0.113.0/24 dev veth0 proto kernel scope link src 203.0.113.2 metric 65535 7557 1726882094.10791: no more pending results, returning what we have 7557 1726882094.10800: results queue empty 7557 1726882094.10801: checking for any_errors_fatal 7557 1726882094.10807: done checking for any_errors_fatal 7557 1726882094.10808: checking for max_fail_percentage 7557 1726882094.10811: done checking for max_fail_percentage 7557 1726882094.10811: checking to see if all hosts have failed and the running result is not ok 7557 1726882094.10812: done checking to see if all hosts have failed 7557 1726882094.10813: getting the remaining hosts for this loop 7557 1726882094.10815: done getting the remaining hosts for this loop 7557 1726882094.10818: getting the next task for host managed_node3 7557 1726882094.10825: done getting next task for host managed_node3 7557 1726882094.10828: ^ task is: TASK: Assert default ipv4 route is present 7557 1726882094.10830: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7557 1726882094.10834: getting variables 7557 1726882094.10836: in VariableManager get_vars() 7557 1726882094.10890: Calling all_inventory to load vars for managed_node3 7557 1726882094.11100: Calling groups_inventory to load vars for managed_node3 7557 1726882094.11111: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882094.11119: done sending task result for task 12673a56-9f93-ed48-b3a5-00000000005d 7557 1726882094.11122: WORKER PROCESS EXITING 7557 1726882094.11131: Calling all_plugins_play to load vars for managed_node3 7557 1726882094.11134: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882094.11137: Calling groups_plugins_play to load vars for managed_node3 7557 1726882094.12816: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882094.14539: done with get_vars() 7557 1726882094.14563: done getting variables 7557 1726882094.14623: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Assert default ipv4 route is present] ************************************ task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_auto_gateway.yml:52 Friday 20 September 2024 21:28:14 -0400 (0:00:00.363) 0:00:19.999 ****** 7557 1726882094.14653: entering _queue_task() for managed_node3/assert 7557 1726882094.14963: worker is 1 (out of 1 available) 7557 1726882094.14975: exiting _queue_task() for managed_node3/assert 7557 1726882094.14987: done queuing things up, now waiting for results queue to drain 7557 1726882094.14988: waiting for pending results... 
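[editor's note] The assertion queued above checks the registered route text directly: its handler evaluates the conditional __test_str in ipv4_routes.stdout, with '__test_str' supplied from task vars and 'interface' from play vars, as the next entries show. A minimal sketch of such a task follows; the concrete value of __test_str is an assumption inferred from the route table printed above (default via 203.0.113.1 dev veth0), not taken from the playbook.

    - name: Assert default ipv4 route is present
      ansible.builtin.assert:
        that:
          - __test_str in ipv4_routes.stdout
      vars:
        # hypothetical value, inferred from the 'ip route' output shown earlier
        __test_str: "default via 203.0.113.1 dev {{ interface }}"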
7557 1726882094.15261: running TaskExecutor() for managed_node3/TASK: Assert default ipv4 route is present 7557 1726882094.15359: in run() - task 12673a56-9f93-ed48-b3a5-00000000005e 7557 1726882094.15379: variable 'ansible_search_path' from source: unknown 7557 1726882094.15426: calling self._execute() 7557 1726882094.15537: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882094.15548: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882094.15562: variable 'omit' from source: magic vars 7557 1726882094.15938: variable 'ansible_distribution_major_version' from source: facts 7557 1726882094.15960: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882094.15971: variable 'omit' from source: magic vars 7557 1726882094.15996: variable 'omit' from source: magic vars 7557 1726882094.16038: variable 'omit' from source: magic vars 7557 1726882094.16088: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7557 1726882094.16134: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7557 1726882094.16160: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7557 1726882094.16190: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7557 1726882094.16210: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7557 1726882094.16242: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7557 1726882094.16251: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882094.16257: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882094.16370: Set connection var ansible_module_compression to ZIP_DEFLATED 7557 1726882094.16383: Set connection var ansible_shell_executable to /bin/sh 7557 1726882094.16598: Set connection var ansible_shell_type to sh 7557 1726882094.16601: Set connection var ansible_pipelining to False 7557 1726882094.16603: Set connection var ansible_connection to ssh 7557 1726882094.16606: Set connection var ansible_timeout to 10 7557 1726882094.16608: variable 'ansible_shell_executable' from source: unknown 7557 1726882094.16610: variable 'ansible_connection' from source: unknown 7557 1726882094.16612: variable 'ansible_module_compression' from source: unknown 7557 1726882094.16615: variable 'ansible_shell_type' from source: unknown 7557 1726882094.16617: variable 'ansible_shell_executable' from source: unknown 7557 1726882094.16619: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882094.16621: variable 'ansible_pipelining' from source: unknown 7557 1726882094.16623: variable 'ansible_timeout' from source: unknown 7557 1726882094.16625: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882094.16628: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 7557 1726882094.16630: variable 'omit' from source: magic vars 7557 1726882094.16633: starting attempt loop 7557 1726882094.16641: running the handler 7557 1726882094.16791: 
variable '__test_str' from source: task vars 7557 1726882094.16869: variable 'interface' from source: play vars 7557 1726882094.16885: variable 'ipv4_routes' from source: set_fact 7557 1726882094.16903: Evaluated conditional (__test_str in ipv4_routes.stdout): True 7557 1726882094.16916: handler run complete 7557 1726882094.16934: attempt loop complete, returning result 7557 1726882094.16940: _execute() done 7557 1726882094.16945: dumping result to json 7557 1726882094.16950: done dumping result, returning 7557 1726882094.16961: done running TaskExecutor() for managed_node3/TASK: Assert default ipv4 route is present [12673a56-9f93-ed48-b3a5-00000000005e] 7557 1726882094.16972: sending task result for task 12673a56-9f93-ed48-b3a5-00000000005e 7557 1726882094.17066: done sending task result for task 12673a56-9f93-ed48-b3a5-00000000005e ok: [managed_node3] => { "changed": false } MSG: All assertions passed 7557 1726882094.17122: no more pending results, returning what we have 7557 1726882094.17127: results queue empty 7557 1726882094.17128: checking for any_errors_fatal 7557 1726882094.17139: done checking for any_errors_fatal 7557 1726882094.17139: checking for max_fail_percentage 7557 1726882094.17141: done checking for max_fail_percentage 7557 1726882094.17142: checking to see if all hosts have failed and the running result is not ok 7557 1726882094.17143: done checking to see if all hosts have failed 7557 1726882094.17144: getting the remaining hosts for this loop 7557 1726882094.17146: done getting the remaining hosts for this loop 7557 1726882094.17149: getting the next task for host managed_node3 7557 1726882094.17156: done getting next task for host managed_node3 7557 1726882094.17158: ^ task is: TASK: Get ipv6 routes 7557 1726882094.17160: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7557 1726882094.17164: getting variables 7557 1726882094.17166: in VariableManager get_vars() 7557 1726882094.17226: Calling all_inventory to load vars for managed_node3 7557 1726882094.17229: Calling groups_inventory to load vars for managed_node3 7557 1726882094.17232: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882094.17244: Calling all_plugins_play to load vars for managed_node3 7557 1726882094.17247: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882094.17250: Calling groups_plugins_play to load vars for managed_node3 7557 1726882094.17956: WORKER PROCESS EXITING 7557 1726882094.18889: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882094.20520: done with get_vars() 7557 1726882094.20543: done getting variables 7557 1726882094.20606: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Get ipv6 routes] ********************************************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_auto_gateway.yml:57 Friday 20 September 2024 21:28:14 -0400 (0:00:00.059) 0:00:20.059 ****** 7557 1726882094.20635: entering _queue_task() for managed_node3/command 7557 1726882094.20973: worker is 1 (out of 1 available) 7557 1726882094.20985: exiting _queue_task() for managed_node3/command 7557 1726882094.21199: done queuing things up, now waiting for results queue to drain 7557 1726882094.21201: waiting for pending results... 
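[editor's note] The ipv4 assertion passed ("All assertions passed") and the play moves on to the parallel ipv6 query. The module invocation logged below runs 'ip -6 route' through the same command action plugin and the same AnsiballZ staging sequence. A sketch of the corresponding task, with an assumed register name and an assumed changed_when guard, would be:

    - name: Get ipv6 routes
      ansible.builtin.command: ip -6 route
      register: ipv6_route             # hypothetical register name
      changed_when: false              # assumed; the reported result again shows changed: false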
7557 1726882094.21287: running TaskExecutor() for managed_node3/TASK: Get ipv6 routes 7557 1726882094.21401: in run() - task 12673a56-9f93-ed48-b3a5-00000000005f 7557 1726882094.21427: variable 'ansible_search_path' from source: unknown 7557 1726882094.21471: calling self._execute() 7557 1726882094.21582: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882094.21596: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882094.21612: variable 'omit' from source: magic vars 7557 1726882094.21997: variable 'ansible_distribution_major_version' from source: facts 7557 1726882094.22080: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882094.22083: variable 'omit' from source: magic vars 7557 1726882094.22086: variable 'omit' from source: magic vars 7557 1726882094.22092: variable 'omit' from source: magic vars 7557 1726882094.22139: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7557 1726882094.22185: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7557 1726882094.22215: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7557 1726882094.22238: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7557 1726882094.22257: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7557 1726882094.22297: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7557 1726882094.22309: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882094.22398: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882094.22434: Set connection var ansible_module_compression to ZIP_DEFLATED 7557 1726882094.22447: Set connection var ansible_shell_executable to /bin/sh 7557 1726882094.22456: Set connection var ansible_shell_type to sh 7557 1726882094.22465: Set connection var ansible_pipelining to False 7557 1726882094.22472: Set connection var ansible_connection to ssh 7557 1726882094.22484: Set connection var ansible_timeout to 10 7557 1726882094.22512: variable 'ansible_shell_executable' from source: unknown 7557 1726882094.22524: variable 'ansible_connection' from source: unknown 7557 1726882094.22532: variable 'ansible_module_compression' from source: unknown 7557 1726882094.22540: variable 'ansible_shell_type' from source: unknown 7557 1726882094.22547: variable 'ansible_shell_executable' from source: unknown 7557 1726882094.22555: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882094.22563: variable 'ansible_pipelining' from source: unknown 7557 1726882094.22570: variable 'ansible_timeout' from source: unknown 7557 1726882094.22578: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882094.22722: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 7557 1726882094.22744: variable 'omit' from source: magic vars 7557 1726882094.22798: starting attempt loop 7557 1726882094.22801: running the handler 7557 1726882094.22803: 
_low_level_execute_command(): starting 7557 1726882094.22806: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7557 1726882094.23528: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7557 1726882094.23545: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7557 1726882094.23559: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882094.23614: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882094.23679: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882094.23700: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882094.23731: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882094.23814: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882094.25411: stdout chunk (state=3): >>>/root <<< 7557 1726882094.25569: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882094.25573: stdout chunk (state=3): >>><<< 7557 1726882094.25575: stderr chunk (state=3): >>><<< 7557 1726882094.25599: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7557 1726882094.25702: _low_level_execute_command(): starting 7557 1726882094.25705: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882094.2560942-8381-246174340000336 `" && echo 
ansible-tmp-1726882094.2560942-8381-246174340000336="` echo /root/.ansible/tmp/ansible-tmp-1726882094.2560942-8381-246174340000336 `" ) && sleep 0' 7557 1726882094.26262: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7557 1726882094.26276: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7557 1726882094.26289: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882094.26312: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7557 1726882094.26334: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 <<< 7557 1726882094.26446: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882094.26474: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882094.26555: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882094.28386: stdout chunk (state=3): >>>ansible-tmp-1726882094.2560942-8381-246174340000336=/root/.ansible/tmp/ansible-tmp-1726882094.2560942-8381-246174340000336 <<< 7557 1726882094.28541: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882094.28545: stdout chunk (state=3): >>><<< 7557 1726882094.28547: stderr chunk (state=3): >>><<< 7557 1726882094.28700: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882094.2560942-8381-246174340000336=/root/.ansible/tmp/ansible-tmp-1726882094.2560942-8381-246174340000336 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7557 1726882094.28703: variable 'ansible_module_compression' from source: unknown 7557 
1726882094.28706: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-7557ap94rh2e/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 7557 1726882094.28708: variable 'ansible_facts' from source: unknown 7557 1726882094.28789: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882094.2560942-8381-246174340000336/AnsiballZ_command.py 7557 1726882094.28953: Sending initial data 7557 1726882094.28960: Sent initial data (154 bytes) 7557 1726882094.29602: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7557 1726882094.29715: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882094.29737: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882094.29760: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882094.29862: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882094.31457: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 7557 1726882094.31505: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 7557 1726882094.31563: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-7557ap94rh2e/tmpavuko8o4 /root/.ansible/tmp/ansible-tmp-1726882094.2560942-8381-246174340000336/AnsiballZ_command.py <<< 7557 1726882094.31566: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882094.2560942-8381-246174340000336/AnsiballZ_command.py" <<< 7557 1726882094.31673: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-7557ap94rh2e/tmpavuko8o4" to remote "/root/.ansible/tmp/ansible-tmp-1726882094.2560942-8381-246174340000336/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882094.2560942-8381-246174340000336/AnsiballZ_command.py" <<< 7557 1726882094.32997: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882094.33011: stdout chunk (state=3): >>><<< 7557 1726882094.33139: stderr chunk (state=3): >>><<< 7557 1726882094.33142: done transferring module to remote 7557 1726882094.33157: _low_level_execute_command(): starting 7557 1726882094.33310: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882094.2560942-8381-246174340000336/ /root/.ansible/tmp/ansible-tmp-1726882094.2560942-8381-246174340000336/AnsiballZ_command.py && sleep 0' 7557 1726882094.34302: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882094.34741: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882094.35154: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882094.36604: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882094.36799: stdout chunk (state=3): >>><<< 7557 1726882094.36804: stderr chunk (state=3): >>><<< 7557 1726882094.36807: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7557 1726882094.36809: _low_level_execute_command(): starting 7557 1726882094.36812: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882094.2560942-8381-246174340000336/AnsiballZ_command.py && sleep 0' 7557 1726882094.37170: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7557 1726882094.37184: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7557 1726882094.37211: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882094.37232: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7557 1726882094.37263: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882094.37276: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7557 1726882094.37375: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882094.37389: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882094.37411: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882094.37502: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882094.52507: stdout chunk (state=3): >>> {"changed": true, "stdout": "2001:db8::/64 dev veth0 proto kernel metric 101 pref medium\nfe80::/64 dev peerveth0 proto kernel metric 256 pref medium\nfe80::/64 dev eth0 proto kernel metric 1024 pref medium\nfe80::/64 dev veth0 proto kernel metric 1024 pref medium\ndefault via 2001:db8::1 dev veth0 proto static metric 101 pref medium", "stderr": "", "rc": 0, "cmd": ["ip", "-6", "route"], "start": "2024-09-20 21:28:14.520185", "end": "2024-09-20 21:28:14.523478", "delta": "0:00:00.003293", "msg": "", "invocation": {"module_args": {"_raw_params": "ip -6 route", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 7557 1726882094.53855: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. 
<<< 7557 1726882094.53881: stderr chunk (state=3): >>><<< 7557 1726882094.53884: stdout chunk (state=3): >>><<< 7557 1726882094.53905: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "2001:db8::/64 dev veth0 proto kernel metric 101 pref medium\nfe80::/64 dev peerveth0 proto kernel metric 256 pref medium\nfe80::/64 dev eth0 proto kernel metric 1024 pref medium\nfe80::/64 dev veth0 proto kernel metric 1024 pref medium\ndefault via 2001:db8::1 dev veth0 proto static metric 101 pref medium", "stderr": "", "rc": 0, "cmd": ["ip", "-6", "route"], "start": "2024-09-20 21:28:14.520185", "end": "2024-09-20 21:28:14.523478", "delta": "0:00:00.003293", "msg": "", "invocation": {"module_args": {"_raw_params": "ip -6 route", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. 
7557 1726882094.53934: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip -6 route', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882094.2560942-8381-246174340000336/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7557 1726882094.53941: _low_level_execute_command(): starting 7557 1726882094.53946: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882094.2560942-8381-246174340000336/ > /dev/null 2>&1 && sleep 0' 7557 1726882094.54370: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882094.54374: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882094.54380: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 7557 1726882094.54383: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found <<< 7557 1726882094.54385: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882094.54426: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882094.54439: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882094.54486: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882094.56245: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882094.56265: stderr chunk (state=3): >>><<< 7557 1726882094.56268: stdout chunk (state=3): >>><<< 7557 1726882094.56280: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7557 1726882094.56286: handler run complete 7557 1726882094.56307: Evaluated conditional (False): False 7557 1726882094.56316: attempt loop complete, returning result 7557 1726882094.56319: _execute() done 7557 1726882094.56321: dumping result to json 7557 1726882094.56326: done dumping result, returning 7557 1726882094.56333: done running TaskExecutor() for managed_node3/TASK: Get ipv6 routes [12673a56-9f93-ed48-b3a5-00000000005f] 7557 1726882094.56338: sending task result for task 12673a56-9f93-ed48-b3a5-00000000005f 7557 1726882094.56439: done sending task result for task 12673a56-9f93-ed48-b3a5-00000000005f 7557 1726882094.56442: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "cmd": [ "ip", "-6", "route" ], "delta": "0:00:00.003293", "end": "2024-09-20 21:28:14.523478", "rc": 0, "start": "2024-09-20 21:28:14.520185" } STDOUT: 2001:db8::/64 dev veth0 proto kernel metric 101 pref medium fe80::/64 dev peerveth0 proto kernel metric 256 pref medium fe80::/64 dev eth0 proto kernel metric 1024 pref medium fe80::/64 dev veth0 proto kernel metric 1024 pref medium default via 2001:db8::1 dev veth0 proto static metric 101 pref medium 7557 1726882094.56536: no more pending results, returning what we have 7557 1726882094.56539: results queue empty 7557 1726882094.56540: checking for any_errors_fatal 7557 1726882094.56546: done checking for any_errors_fatal 7557 1726882094.56546: checking for max_fail_percentage 7557 1726882094.56548: done checking for max_fail_percentage 7557 1726882094.56549: checking to see if all hosts have failed and the running result is not ok 7557 1726882094.56549: done checking to see if all hosts have failed 7557 1726882094.56550: getting the remaining hosts for this loop 7557 1726882094.56551: done getting the remaining hosts for this loop 7557 1726882094.56554: getting the next task for host managed_node3 7557 1726882094.56559: done getting next task for host managed_node3 7557 1726882094.56561: ^ task is: TASK: Assert default ipv6 route is present 7557 1726882094.56565: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7557 1726882094.56568: getting variables 7557 1726882094.56569: in VariableManager get_vars() 7557 1726882094.56616: Calling all_inventory to load vars for managed_node3 7557 1726882094.56619: Calling groups_inventory to load vars for managed_node3 7557 1726882094.56621: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882094.56631: Calling all_plugins_play to load vars for managed_node3 7557 1726882094.56633: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882094.56636: Calling groups_plugins_play to load vars for managed_node3 7557 1726882094.57418: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882094.58265: done with get_vars() 7557 1726882094.58280: done getting variables 7557 1726882094.58327: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Assert default ipv6 route is present] ************************************ task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_auto_gateway.yml:61 Friday 20 September 2024 21:28:14 -0400 (0:00:00.377) 0:00:20.436 ****** 7557 1726882094.58347: entering _queue_task() for managed_node3/assert 7557 1726882094.58563: worker is 1 (out of 1 available) 7557 1726882094.58576: exiting _queue_task() for managed_node3/assert 7557 1726882094.58588: done queuing things up, now waiting for results queue to drain 7557 1726882094.58589: waiting for pending results... 7557 1726882094.58766: running TaskExecutor() for managed_node3/TASK: Assert default ipv6 route is present 7557 1726882094.58833: in run() - task 12673a56-9f93-ed48-b3a5-000000000060 7557 1726882094.58845: variable 'ansible_search_path' from source: unknown 7557 1726882094.58874: calling self._execute() 7557 1726882094.58961: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882094.58965: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882094.58973: variable 'omit' from source: magic vars 7557 1726882094.59251: variable 'ansible_distribution_major_version' from source: facts 7557 1726882094.59269: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882094.59348: variable 'network_provider' from source: set_fact 7557 1726882094.59352: Evaluated conditional (network_provider == "nm"): True 7557 1726882094.59363: variable 'omit' from source: magic vars 7557 1726882094.59377: variable 'omit' from source: magic vars 7557 1726882094.59406: variable 'omit' from source: magic vars 7557 1726882094.59437: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7557 1726882094.59465: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7557 1726882094.59482: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7557 1726882094.59499: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7557 1726882094.59509: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7557 1726882094.59531: variable 
'inventory_hostname' from source: host vars for 'managed_node3' 7557 1726882094.59534: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882094.59537: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882094.59612: Set connection var ansible_module_compression to ZIP_DEFLATED 7557 1726882094.59618: Set connection var ansible_shell_executable to /bin/sh 7557 1726882094.59621: Set connection var ansible_shell_type to sh 7557 1726882094.59626: Set connection var ansible_pipelining to False 7557 1726882094.59629: Set connection var ansible_connection to ssh 7557 1726882094.59634: Set connection var ansible_timeout to 10 7557 1726882094.59651: variable 'ansible_shell_executable' from source: unknown 7557 1726882094.59655: variable 'ansible_connection' from source: unknown 7557 1726882094.59657: variable 'ansible_module_compression' from source: unknown 7557 1726882094.59660: variable 'ansible_shell_type' from source: unknown 7557 1726882094.59662: variable 'ansible_shell_executable' from source: unknown 7557 1726882094.59664: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882094.59667: variable 'ansible_pipelining' from source: unknown 7557 1726882094.59670: variable 'ansible_timeout' from source: unknown 7557 1726882094.59674: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882094.59774: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 7557 1726882094.59783: variable 'omit' from source: magic vars 7557 1726882094.59786: starting attempt loop 7557 1726882094.59790: running the handler 7557 1726882094.59887: variable '__test_str' from source: task vars 7557 1726882094.59941: variable 'interface' from source: play vars 7557 1726882094.59948: variable 'ipv6_route' from source: set_fact 7557 1726882094.59958: Evaluated conditional (__test_str in ipv6_route.stdout): True 7557 1726882094.59964: handler run complete 7557 1726882094.59975: attempt loop complete, returning result 7557 1726882094.59978: _execute() done 7557 1726882094.59980: dumping result to json 7557 1726882094.59983: done dumping result, returning 7557 1726882094.59989: done running TaskExecutor() for managed_node3/TASK: Assert default ipv6 route is present [12673a56-9f93-ed48-b3a5-000000000060] 7557 1726882094.59998: sending task result for task 12673a56-9f93-ed48-b3a5-000000000060 7557 1726882094.60083: done sending task result for task 12673a56-9f93-ed48-b3a5-000000000060 7557 1726882094.60085: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false } MSG: All assertions passed 7557 1726882094.60168: no more pending results, returning what we have 7557 1726882094.60171: results queue empty 7557 1726882094.60172: checking for any_errors_fatal 7557 1726882094.60179: done checking for any_errors_fatal 7557 1726882094.60180: checking for max_fail_percentage 7557 1726882094.60182: done checking for max_fail_percentage 7557 1726882094.60183: checking to see if all hosts have failed and the running result is not ok 7557 1726882094.60184: done checking to see if all hosts have failed 7557 1726882094.60184: getting the remaining hosts for this loop 7557 1726882094.60186: done getting the remaining hosts for this loop 7557 1726882094.60189: getting the 
next task for host managed_node3 7557 1726882094.60197: done getting next task for host managed_node3 7557 1726882094.60199: ^ task is: TASK: TEARDOWN: remove profiles. 7557 1726882094.60201: ^ state is: HOST STATE: block=2, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7557 1726882094.60204: getting variables 7557 1726882094.60205: in VariableManager get_vars() 7557 1726882094.60248: Calling all_inventory to load vars for managed_node3 7557 1726882094.60251: Calling groups_inventory to load vars for managed_node3 7557 1726882094.60253: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882094.60262: Calling all_plugins_play to load vars for managed_node3 7557 1726882094.60264: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882094.60266: Calling groups_plugins_play to load vars for managed_node3 7557 1726882094.61187: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882094.62035: done with get_vars() 7557 1726882094.62053: done getting variables 7557 1726882094.62099: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [TEARDOWN: remove profiles.] ********************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_auto_gateway.yml:67 Friday 20 September 2024 21:28:14 -0400 (0:00:00.037) 0:00:20.474 ****** 7557 1726882094.62120: entering _queue_task() for managed_node3/debug 7557 1726882094.62367: worker is 1 (out of 1 available) 7557 1726882094.62379: exiting _queue_task() for managed_node3/debug 7557 1726882094.62396: done queuing things up, now waiting for results queue to drain 7557 1726882094.62398: waiting for pending results... 7557 1726882094.62570: running TaskExecutor() for managed_node3/TASK: TEARDOWN: remove profiles. 
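The assertion evaluated just above checks that the __test_str substring occurs in ipv6_route.stdout, and it only runs when network_provider == "nm" (both conditionals are shown as Evaluated ... True). A hedged sketch of that pattern follows; the actual value of __test_str is never printed in this log, so the example value built from the interface variable is an assumption.

# Sketch of the assertion pattern evaluated above; only the two conditional expressions are confirmed by the log.
- name: Assert default ipv6 route is present
  vars:
    __test_str: "default via 2001:db8::1 dev {{ interface }}"   # assumed value; log only shows the variable names
  assert:
    that:
      - __test_str in ipv6_route.stdout
  when: network_provider == "nm"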
7557 1726882094.62639: in run() - task 12673a56-9f93-ed48-b3a5-000000000061 7557 1726882094.62652: variable 'ansible_search_path' from source: unknown 7557 1726882094.62680: calling self._execute() 7557 1726882094.62762: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882094.62767: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882094.62777: variable 'omit' from source: magic vars 7557 1726882094.63050: variable 'ansible_distribution_major_version' from source: facts 7557 1726882094.63061: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882094.63072: variable 'omit' from source: magic vars 7557 1726882094.63088: variable 'omit' from source: magic vars 7557 1726882094.63120: variable 'omit' from source: magic vars 7557 1726882094.63152: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7557 1726882094.63184: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7557 1726882094.63204: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7557 1726882094.63218: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7557 1726882094.63230: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7557 1726882094.63255: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7557 1726882094.63258: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882094.63260: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882094.63337: Set connection var ansible_module_compression to ZIP_DEFLATED 7557 1726882094.63344: Set connection var ansible_shell_executable to /bin/sh 7557 1726882094.63347: Set connection var ansible_shell_type to sh 7557 1726882094.63351: Set connection var ansible_pipelining to False 7557 1726882094.63353: Set connection var ansible_connection to ssh 7557 1726882094.63358: Set connection var ansible_timeout to 10 7557 1726882094.63374: variable 'ansible_shell_executable' from source: unknown 7557 1726882094.63378: variable 'ansible_connection' from source: unknown 7557 1726882094.63383: variable 'ansible_module_compression' from source: unknown 7557 1726882094.63386: variable 'ansible_shell_type' from source: unknown 7557 1726882094.63388: variable 'ansible_shell_executable' from source: unknown 7557 1726882094.63391: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882094.63394: variable 'ansible_pipelining' from source: unknown 7557 1726882094.63397: variable 'ansible_timeout' from source: unknown 7557 1726882094.63399: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882094.63496: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 7557 1726882094.63509: variable 'omit' from source: magic vars 7557 1726882094.63513: starting attempt loop 7557 1726882094.63515: running the handler 7557 1726882094.63556: handler run complete 7557 1726882094.63568: attempt loop complete, returning result 7557 1726882094.63571: 
_execute() done 7557 1726882094.63574: dumping result to json 7557 1726882094.63576: done dumping result, returning 7557 1726882094.63582: done running TaskExecutor() for managed_node3/TASK: TEARDOWN: remove profiles. [12673a56-9f93-ed48-b3a5-000000000061] 7557 1726882094.63587: sending task result for task 12673a56-9f93-ed48-b3a5-000000000061 7557 1726882094.63668: done sending task result for task 12673a56-9f93-ed48-b3a5-000000000061 7557 1726882094.63670: WORKER PROCESS EXITING ok: [managed_node3] => {} MSG: ################################################## 7557 1726882094.63718: no more pending results, returning what we have 7557 1726882094.63722: results queue empty 7557 1726882094.63723: checking for any_errors_fatal 7557 1726882094.63729: done checking for any_errors_fatal 7557 1726882094.63730: checking for max_fail_percentage 7557 1726882094.63732: done checking for max_fail_percentage 7557 1726882094.63732: checking to see if all hosts have failed and the running result is not ok 7557 1726882094.63733: done checking to see if all hosts have failed 7557 1726882094.63734: getting the remaining hosts for this loop 7557 1726882094.63735: done getting the remaining hosts for this loop 7557 1726882094.63738: getting the next task for host managed_node3 7557 1726882094.63745: done getting next task for host managed_node3 7557 1726882094.63750: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 7557 1726882094.63753: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7557 1726882094.63771: getting variables 7557 1726882094.63773: in VariableManager get_vars() 7557 1726882094.63820: Calling all_inventory to load vars for managed_node3 7557 1726882094.63823: Calling groups_inventory to load vars for managed_node3 7557 1726882094.63825: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882094.63834: Calling all_plugins_play to load vars for managed_node3 7557 1726882094.63837: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882094.63839: Calling groups_plugins_play to load vars for managed_node3 7557 1726882094.64600: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882094.65452: done with get_vars() 7557 1726882094.65465: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 21:28:14 -0400 (0:00:00.034) 0:00:20.508 ****** 7557 1726882094.65534: entering _queue_task() for managed_node3/include_tasks 7557 1726882094.65744: worker is 1 (out of 1 available) 7557 1726882094.65757: exiting _queue_task() for managed_node3/include_tasks 7557 1726882094.65769: done queuing things up, now waiting for results queue to drain 7557 1726882094.65770: waiting for pending results... 
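The "TEARDOWN: remove profiles." header above is produced by a plain debug task whose message is the separator banner shown in MSG; roughly the sketch below (banner length approximate).

# Sketch: a debug task printing a visual separator before teardown, as seen in the MSG above.
- name: "TEARDOWN: remove profiles."
  debug:
    msg: "##################################################"   # run of '#' characters, length as logged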
7557 1726882094.65941: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 7557 1726882094.66035: in run() - task 12673a56-9f93-ed48-b3a5-000000000069 7557 1726882094.66048: variable 'ansible_search_path' from source: unknown 7557 1726882094.66052: variable 'ansible_search_path' from source: unknown 7557 1726882094.66078: calling self._execute() 7557 1726882094.66153: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882094.66156: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882094.66166: variable 'omit' from source: magic vars 7557 1726882094.66440: variable 'ansible_distribution_major_version' from source: facts 7557 1726882094.66444: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882094.66451: _execute() done 7557 1726882094.66453: dumping result to json 7557 1726882094.66456: done dumping result, returning 7557 1726882094.66464: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [12673a56-9f93-ed48-b3a5-000000000069] 7557 1726882094.66468: sending task result for task 12673a56-9f93-ed48-b3a5-000000000069 7557 1726882094.66550: done sending task result for task 12673a56-9f93-ed48-b3a5-000000000069 7557 1726882094.66553: WORKER PROCESS EXITING 7557 1726882094.66595: no more pending results, returning what we have 7557 1726882094.66601: in VariableManager get_vars() 7557 1726882094.66650: Calling all_inventory to load vars for managed_node3 7557 1726882094.66653: Calling groups_inventory to load vars for managed_node3 7557 1726882094.66656: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882094.66664: Calling all_plugins_play to load vars for managed_node3 7557 1726882094.66667: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882094.66669: Calling groups_plugins_play to load vars for managed_node3 7557 1726882094.67506: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882094.68340: done with get_vars() 7557 1726882094.68353: variable 'ansible_search_path' from source: unknown 7557 1726882094.68354: variable 'ansible_search_path' from source: unknown 7557 1726882094.68380: we have included files to process 7557 1726882094.68380: generating all_blocks data 7557 1726882094.68382: done generating all_blocks data 7557 1726882094.68385: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 7557 1726882094.68385: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 7557 1726882094.68387: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 7557 1726882094.68751: done processing included file 7557 1726882094.68752: iterating over new_blocks loaded from include file 7557 1726882094.68753: in VariableManager get_vars() 7557 1726882094.68771: done with get_vars() 7557 1726882094.68772: filtering new block on tags 7557 1726882094.68783: done filtering new block on tags 7557 1726882094.68784: in VariableManager get_vars() 7557 1726882094.68806: done with get_vars() 7557 1726882094.68807: filtering new block on tags 7557 1726882094.68820: done filtering new block on tags 7557 1726882094.68822: in VariableManager get_vars() 7557 1726882094.68838: done 
with get_vars() 7557 1726882094.68839: filtering new block on tags 7557 1726882094.68849: done filtering new block on tags 7557 1726882094.68850: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node3 7557 1726882094.68854: extending task lists for all hosts with included blocks 7557 1726882094.69295: done extending task lists 7557 1726882094.69296: done processing included files 7557 1726882094.69297: results queue empty 7557 1726882094.69297: checking for any_errors_fatal 7557 1726882094.69299: done checking for any_errors_fatal 7557 1726882094.69300: checking for max_fail_percentage 7557 1726882094.69300: done checking for max_fail_percentage 7557 1726882094.69301: checking to see if all hosts have failed and the running result is not ok 7557 1726882094.69301: done checking to see if all hosts have failed 7557 1726882094.69302: getting the remaining hosts for this loop 7557 1726882094.69302: done getting the remaining hosts for this loop 7557 1726882094.69304: getting the next task for host managed_node3 7557 1726882094.69306: done getting next task for host managed_node3 7557 1726882094.69308: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 7557 1726882094.69310: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7557 1726882094.69316: getting variables 7557 1726882094.69316: in VariableManager get_vars() 7557 1726882094.69329: Calling all_inventory to load vars for managed_node3 7557 1726882094.69331: Calling groups_inventory to load vars for managed_node3 7557 1726882094.69333: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882094.69336: Calling all_plugins_play to load vars for managed_node3 7557 1726882094.69338: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882094.69339: Calling groups_plugins_play to load vars for managed_node3 7557 1726882094.69962: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882094.70848: done with get_vars() 7557 1726882094.70861: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Friday 20 September 2024 21:28:14 -0400 (0:00:00.053) 0:00:20.562 ****** 7557 1726882094.70912: entering _queue_task() for managed_node3/setup 7557 1726882094.71135: worker is 1 (out of 1 available) 7557 1726882094.71148: exiting _queue_task() for managed_node3/setup 7557 1726882094.71161: done queuing things up, now waiting for results queue to drain 7557 1726882094.71162: waiting for pending results... 7557 1726882094.71339: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 7557 1726882094.71442: in run() - task 12673a56-9f93-ed48-b3a5-000000000d46 7557 1726882094.71454: variable 'ansible_search_path' from source: unknown 7557 1726882094.71458: variable 'ansible_search_path' from source: unknown 7557 1726882094.71484: calling self._execute() 7557 1726882094.71563: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882094.71566: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882094.71576: variable 'omit' from source: magic vars 7557 1726882094.71838: variable 'ansible_distribution_major_version' from source: facts 7557 1726882094.71848: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882094.71987: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 7557 1726882094.73427: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 7557 1726882094.73479: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 7557 1726882094.73509: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 7557 1726882094.73535: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 7557 1726882094.73556: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 7557 1726882094.73619: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7557 1726882094.73640: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7557 1726882094.73657: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7557 1726882094.73687: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7557 1726882094.73702: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7557 1726882094.73739: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7557 1726882094.73755: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7557 1726882094.73772: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7557 1726882094.73804: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7557 1726882094.73815: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7557 1726882094.73921: variable '__network_required_facts' from source: role '' defaults 7557 1726882094.73929: variable 'ansible_facts' from source: unknown 7557 1726882094.74363: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 7557 1726882094.74366: when evaluation is False, skipping this task 7557 1726882094.74369: _execute() done 7557 1726882094.74371: dumping result to json 7557 1726882094.74374: done dumping result, returning 7557 1726882094.74380: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [12673a56-9f93-ed48-b3a5-000000000d46] 7557 1726882094.74385: sending task result for task 12673a56-9f93-ed48-b3a5-000000000d46 7557 1726882094.74467: done sending task result for task 12673a56-9f93-ed48-b3a5-000000000d46 7557 1726882094.74470: WORKER PROCESS EXITING skipping: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 7557 1726882094.74517: no more pending results, returning what we have 7557 1726882094.74520: results queue empty 7557 1726882094.74521: checking for any_errors_fatal 7557 1726882094.74522: done checking for any_errors_fatal 7557 1726882094.74523: checking for max_fail_percentage 7557 1726882094.74525: done checking for max_fail_percentage 7557 1726882094.74525: checking to see if all hosts have failed and the running result is not ok 7557 1726882094.74526: done checking to see if all hosts have failed 7557 1726882094.74527: getting the remaining hosts for this loop 7557 1726882094.74528: done getting the remaining hosts for this loop 7557 1726882094.74532: getting the next task for host managed_node3 
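The skip above comes from a guard that only gathers facts when something the role needs is missing: the conditional __network_required_facts | difference(ansible_facts.keys() | list) | length > 0 evaluated False because every required fact was already present, and no_log hides the skipped result ("censored"). A hedged sketch of that pattern, where the gather_subset and the contents of __network_required_facts (a role default) are assumptions:

# Sketch of the conditional fact-gathering guard whose condition evaluated False above.
- name: Ensure ansible_facts used by role are present
  setup:
    gather_subset: min          # assumed; the log never shows the module arguments because the task is skipped
  when: __network_required_facts | difference(ansible_facts.keys() | list) | length > 0
  no_log: true                  # matches the "censored" result shown above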
7557 1726882094.74540: done getting next task for host managed_node3 7557 1726882094.74543: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 7557 1726882094.74547: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7557 1726882094.74563: getting variables 7557 1726882094.74565: in VariableManager get_vars() 7557 1726882094.74613: Calling all_inventory to load vars for managed_node3 7557 1726882094.74616: Calling groups_inventory to load vars for managed_node3 7557 1726882094.74618: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882094.74627: Calling all_plugins_play to load vars for managed_node3 7557 1726882094.74629: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882094.74632: Calling groups_plugins_play to load vars for managed_node3 7557 1726882094.75407: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882094.76263: done with get_vars() 7557 1726882094.76277: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Friday 20 September 2024 21:28:14 -0400 (0:00:00.054) 0:00:20.616 ****** 7557 1726882094.76351: entering _queue_task() for managed_node3/stat 7557 1726882094.76558: worker is 1 (out of 1 available) 7557 1726882094.76571: exiting _queue_task() for managed_node3/stat 7557 1726882094.76584: done queuing things up, now waiting for results queue to drain 7557 1726882094.76586: waiting for pending results... 
7557 1726882094.76762: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if system is ostree 7557 1726882094.76870: in run() - task 12673a56-9f93-ed48-b3a5-000000000d48 7557 1726882094.76882: variable 'ansible_search_path' from source: unknown 7557 1726882094.76885: variable 'ansible_search_path' from source: unknown 7557 1726882094.76918: calling self._execute() 7557 1726882094.76987: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882094.76990: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882094.77002: variable 'omit' from source: magic vars 7557 1726882094.77268: variable 'ansible_distribution_major_version' from source: facts 7557 1726882094.77277: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882094.77390: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 7557 1726882094.77579: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 7557 1726882094.77611: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 7557 1726882094.77637: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 7557 1726882094.77662: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 7557 1726882094.77726: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 7557 1726882094.77744: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 7557 1726882094.77761: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 7557 1726882094.77778: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 7557 1726882094.77842: variable '__network_is_ostree' from source: set_fact 7557 1726882094.77848: Evaluated conditional (not __network_is_ostree is defined): False 7557 1726882094.77851: when evaluation is False, skipping this task 7557 1726882094.77853: _execute() done 7557 1726882094.77856: dumping result to json 7557 1726882094.77858: done dumping result, returning 7557 1726882094.77865: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if system is ostree [12673a56-9f93-ed48-b3a5-000000000d48] 7557 1726882094.77870: sending task result for task 12673a56-9f93-ed48-b3a5-000000000d48 7557 1726882094.77946: done sending task result for task 12673a56-9f93-ed48-b3a5-000000000d48 7557 1726882094.77949: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 7557 1726882094.77997: no more pending results, returning what we have 7557 1726882094.78001: results queue empty 7557 1726882094.78002: checking for any_errors_fatal 7557 1726882094.78007: done checking for any_errors_fatal 7557 1726882094.78007: checking for max_fail_percentage 7557 
1726882094.78009: done checking for max_fail_percentage 7557 1726882094.78010: checking to see if all hosts have failed and the running result is not ok 7557 1726882094.78011: done checking to see if all hosts have failed 7557 1726882094.78011: getting the remaining hosts for this loop 7557 1726882094.78013: done getting the remaining hosts for this loop 7557 1726882094.78016: getting the next task for host managed_node3 7557 1726882094.78022: done getting next task for host managed_node3 7557 1726882094.78026: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 7557 1726882094.78029: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7557 1726882094.78043: getting variables 7557 1726882094.78045: in VariableManager get_vars() 7557 1726882094.78084: Calling all_inventory to load vars for managed_node3 7557 1726882094.78086: Calling groups_inventory to load vars for managed_node3 7557 1726882094.78088: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882094.78105: Calling all_plugins_play to load vars for managed_node3 7557 1726882094.78108: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882094.78111: Calling groups_plugins_play to load vars for managed_node3 7557 1726882094.78962: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882094.79859: done with get_vars() 7557 1726882094.79872: done getting variables 7557 1726882094.79914: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Friday 20 September 2024 21:28:14 -0400 (0:00:00.035) 0:00:20.652 ****** 7557 1726882094.79937: entering _queue_task() for managed_node3/set_fact 7557 1726882094.80135: worker is 1 (out of 1 available) 7557 1726882094.80146: exiting _queue_task() for managed_node3/set_fact 7557 1726882094.80160: done queuing things up, now waiting for results queue to drain 7557 1726882094.80162: waiting for pending results... 
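Both ostree-related tasks here are skipped because __network_is_ostree is already defined from an earlier set_fact in the run (false_condition: "not __network_is_ostree is defined"). The underlying pattern is a stat check followed by a cached flag, roughly as sketched below; the stat path and register name are assumptions since the skipped tasks never print their arguments.

# Sketch of the ostree detection pair that was skipped above; task names and when-conditions match the log.
- name: Check if system is ostree
  stat:
    path: /run/ostree-booted            # assumed marker file; not visible in this log
  register: __ostree_booted_stat        # hypothetical register name
  when: not __network_is_ostree is defined

- name: Set flag to indicate system is ostree
  set_fact:
    __network_is_ostree: "{{ __ostree_booted_stat.stat.exists }}"
  when: not __network_is_ostree is defined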
7557 1726882094.80323: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 7557 1726882094.80423: in run() - task 12673a56-9f93-ed48-b3a5-000000000d49 7557 1726882094.80435: variable 'ansible_search_path' from source: unknown 7557 1726882094.80438: variable 'ansible_search_path' from source: unknown 7557 1726882094.80463: calling self._execute() 7557 1726882094.80538: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882094.80541: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882094.80551: variable 'omit' from source: magic vars 7557 1726882094.80803: variable 'ansible_distribution_major_version' from source: facts 7557 1726882094.80812: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882094.81118: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 7557 1726882094.81241: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 7557 1726882094.81287: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 7557 1726882094.81327: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 7557 1726882094.81366: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 7557 1726882094.81452: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 7557 1726882094.81483: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 7557 1726882094.81516: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 7557 1726882094.81546: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 7557 1726882094.81633: variable '__network_is_ostree' from source: set_fact 7557 1726882094.81647: Evaluated conditional (not __network_is_ostree is defined): False 7557 1726882094.81653: when evaluation is False, skipping this task 7557 1726882094.81660: _execute() done 7557 1726882094.81666: dumping result to json 7557 1726882094.81673: done dumping result, returning 7557 1726882094.81683: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [12673a56-9f93-ed48-b3a5-000000000d49] 7557 1726882094.81692: sending task result for task 12673a56-9f93-ed48-b3a5-000000000d49 7557 1726882094.81999: done sending task result for task 12673a56-9f93-ed48-b3a5-000000000d49 7557 1726882094.82003: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 7557 1726882094.82066: no more pending results, returning what we have 7557 1726882094.82070: results queue empty 7557 1726882094.82074: checking for any_errors_fatal 7557 1726882094.82079: done checking for any_errors_fatal 7557 1726882094.82080: checking for 
max_fail_percentage 7557 1726882094.82081: done checking for max_fail_percentage 7557 1726882094.82082: checking to see if all hosts have failed and the running result is not ok 7557 1726882094.82083: done checking to see if all hosts have failed 7557 1726882094.82083: getting the remaining hosts for this loop 7557 1726882094.82085: done getting the remaining hosts for this loop 7557 1726882094.82087: getting the next task for host managed_node3 7557 1726882094.82097: done getting next task for host managed_node3 7557 1726882094.82100: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 7557 1726882094.82104: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7557 1726882094.82124: getting variables 7557 1726882094.82129: in VariableManager get_vars() 7557 1726882094.82163: Calling all_inventory to load vars for managed_node3 7557 1726882094.82165: Calling groups_inventory to load vars for managed_node3 7557 1726882094.82166: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882094.82172: Calling all_plugins_play to load vars for managed_node3 7557 1726882094.82174: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882094.82175: Calling groups_plugins_play to load vars for managed_node3 7557 1726882094.82902: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882094.83747: done with get_vars() 7557 1726882094.83761: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Friday 20 September 2024 21:28:14 -0400 (0:00:00.038) 0:00:20.691 ****** 7557 1726882094.83825: entering _queue_task() for managed_node3/service_facts 7557 1726882094.84044: worker is 1 (out of 1 available) 7557 1726882094.84055: exiting _queue_task() for managed_node3/service_facts 7557 1726882094.84068: done queuing things up, now waiting for results queue to drain 7557 1726882094.84069: waiting for pending results... 
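The "Check which services are running" task queued above uses the service_facts module; a few entries later the log shows the cached ansible.modules.service_facts payload being transferred and executed over the SSH connection. A minimal sketch:

# Sketch: service_facts takes no arguments; its results land in ansible_facts.services,
# which the role can then inspect (for example, to decide whether NetworkManager is running).
- name: Check which services are running
  service_facts: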
7557 1726882094.84329: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which services are running 7557 1726882094.84469: in run() - task 12673a56-9f93-ed48-b3a5-000000000d4b 7557 1726882094.84487: variable 'ansible_search_path' from source: unknown 7557 1726882094.84496: variable 'ansible_search_path' from source: unknown 7557 1726882094.84538: calling self._execute() 7557 1726882094.84632: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882094.84642: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882094.84654: variable 'omit' from source: magic vars 7557 1726882094.85005: variable 'ansible_distribution_major_version' from source: facts 7557 1726882094.85021: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882094.85030: variable 'omit' from source: magic vars 7557 1726882094.85109: variable 'omit' from source: magic vars 7557 1726882094.85147: variable 'omit' from source: magic vars 7557 1726882094.85191: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7557 1726882094.85231: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7557 1726882094.85277: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7557 1726882094.85281: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7557 1726882094.85296: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7557 1726882094.85330: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7557 1726882094.85498: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882094.85501: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882094.85504: Set connection var ansible_module_compression to ZIP_DEFLATED 7557 1726882094.85506: Set connection var ansible_shell_executable to /bin/sh 7557 1726882094.85508: Set connection var ansible_shell_type to sh 7557 1726882094.85510: Set connection var ansible_pipelining to False 7557 1726882094.85512: Set connection var ansible_connection to ssh 7557 1726882094.85514: Set connection var ansible_timeout to 10 7557 1726882094.85516: variable 'ansible_shell_executable' from source: unknown 7557 1726882094.85518: variable 'ansible_connection' from source: unknown 7557 1726882094.85521: variable 'ansible_module_compression' from source: unknown 7557 1726882094.85522: variable 'ansible_shell_type' from source: unknown 7557 1726882094.85524: variable 'ansible_shell_executable' from source: unknown 7557 1726882094.85526: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882094.85528: variable 'ansible_pipelining' from source: unknown 7557 1726882094.85533: variable 'ansible_timeout' from source: unknown 7557 1726882094.85542: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882094.85728: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 7557 1726882094.85744: variable 'omit' from source: magic vars 7557 1726882094.85758: starting attempt loop 
7557 1726882094.85765: running the handler 7557 1726882094.85781: _low_level_execute_command(): starting 7557 1726882094.85791: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7557 1726882094.86527: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration <<< 7557 1726882094.86610: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882094.86634: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882094.86648: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882094.86669: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882094.86747: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882094.88450: stdout chunk (state=3): >>>/root <<< 7557 1726882094.88586: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882094.88609: stdout chunk (state=3): >>><<< 7557 1726882094.88629: stderr chunk (state=3): >>><<< 7557 1726882094.88742: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7557 1726882094.88746: _low_level_execute_command(): starting 7557 1726882094.88749: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882094.8864808-8410-143522481438662 `" && echo ansible-tmp-1726882094.8864808-8410-143522481438662="` echo 
/root/.ansible/tmp/ansible-tmp-1726882094.8864808-8410-143522481438662 `" ) && sleep 0' 7557 1726882094.89314: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7557 1726882094.89335: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7557 1726882094.89409: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882094.89474: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882094.89491: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882094.89519: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882094.89603: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882094.91455: stdout chunk (state=3): >>>ansible-tmp-1726882094.8864808-8410-143522481438662=/root/.ansible/tmp/ansible-tmp-1726882094.8864808-8410-143522481438662 <<< 7557 1726882094.91608: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882094.91633: stdout chunk (state=3): >>><<< 7557 1726882094.91636: stderr chunk (state=3): >>><<< 7557 1726882094.91800: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882094.8864808-8410-143522481438662=/root/.ansible/tmp/ansible-tmp-1726882094.8864808-8410-143522481438662 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7557 1726882094.91804: variable 'ansible_module_compression' from source: unknown 7557 1726882094.91806: ANSIBALLZ: using cached module: 
/root/.ansible/tmp/ansible-local-7557ap94rh2e/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 7557 1726882094.91808: variable 'ansible_facts' from source: unknown 7557 1726882094.91905: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882094.8864808-8410-143522481438662/AnsiballZ_service_facts.py 7557 1726882094.92051: Sending initial data 7557 1726882094.92151: Sent initial data (160 bytes) 7557 1726882094.92725: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7557 1726882094.92819: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882094.92853: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882094.92869: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882094.92891: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882094.92978: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882094.94503: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 7557 1726882094.94544: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 7557 1726882094.94592: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-7557ap94rh2e/tmpzb00koon /root/.ansible/tmp/ansible-tmp-1726882094.8864808-8410-143522481438662/AnsiballZ_service_facts.py <<< 7557 1726882094.94606: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882094.8864808-8410-143522481438662/AnsiballZ_service_facts.py" <<< 7557 1726882094.94638: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-7557ap94rh2e/tmpzb00koon" to remote "/root/.ansible/tmp/ansible-tmp-1726882094.8864808-8410-143522481438662/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882094.8864808-8410-143522481438662/AnsiballZ_service_facts.py" <<< 7557 1726882094.95203: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882094.95241: stderr chunk (state=3): >>><<< 7557 1726882094.95244: stdout chunk (state=3): >>><<< 7557 1726882094.95306: done transferring module to remote 7557 1726882094.95314: _low_level_execute_command(): starting 7557 1726882094.95318: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882094.8864808-8410-143522481438662/ /root/.ansible/tmp/ansible-tmp-1726882094.8864808-8410-143522481438662/AnsiballZ_service_facts.py && sleep 0' 7557 1726882094.95733: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7557 1726882094.95736: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found <<< 7557 1726882094.95738: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882094.95740: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882094.95742: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882094.95788: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882094.95798: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882094.95844: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882094.97533: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882094.97554: stderr chunk (state=3): >>><<< 7557 1726882094.97558: stdout chunk (state=3): >>><<< 7557 1726882094.97573: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7557 1726882094.97576: _low_level_execute_command(): starting 7557 1726882094.97579: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882094.8864808-8410-143522481438662/AnsiballZ_service_facts.py && sleep 0' 7557 1726882094.97964: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882094.97968: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882094.97979: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882094.98035: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882094.98038: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882094.98095: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882096.45123: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": 
{"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": 
"logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source":<<< 7557 1726882096.45141: stdout chunk (state=3): >>> "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": 
"systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", 
"status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "st<<< 7557 1726882096.45173: stdout chunk (state=3): >>>opped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, 
"dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": 
"systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service<<< 7557 1726882096.45185: stdout chunk (state=3): >>>": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", 
"source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": 
"systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 7557 1726882096.46671: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. <<< 7557 1726882096.46698: stderr chunk (state=3): >>><<< 7557 1726882096.46701: stdout chunk (state=3): >>><<< 7557 1726882096.46730: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": 
{"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": 
{"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": 
"systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", 
"state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", 
"status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "kvm_stat.service": {"name": 
"kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": 
{"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": 
"systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. 
7557 1726882096.47409: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882094.8864808-8410-143522481438662/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7557 1726882096.47417: _low_level_execute_command(): starting 7557 1726882096.47423: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882094.8864808-8410-143522481438662/ > /dev/null 2>&1 && sleep 0' 7557 1726882096.47885: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882096.47888: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882096.47890: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882096.47902: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found <<< 7557 1726882096.47908: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882096.47948: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882096.47951: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882096.47953: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882096.48014: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882096.49784: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882096.49815: stderr chunk (state=3): >>><<< 7557 1726882096.49818: stdout chunk (state=3): >>><<< 7557 1726882096.49835: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7557 1726882096.49842: handler run complete 7557 1726882096.49952: variable 'ansible_facts' from source: unknown 7557 1726882096.50054: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882096.50315: variable 'ansible_facts' from source: unknown 7557 1726882096.50403: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882096.50516: attempt loop complete, returning result 7557 1726882096.50520: _execute() done 7557 1726882096.50522: dumping result to json 7557 1726882096.50555: done dumping result, returning 7557 1726882096.50564: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which services are running [12673a56-9f93-ed48-b3a5-000000000d4b] 7557 1726882096.50569: sending task result for task 12673a56-9f93-ed48-b3a5-000000000d4b 7557 1726882096.51269: done sending task result for task 12673a56-9f93-ed48-b3a5-000000000d4b 7557 1726882096.51272: WORKER PROCESS EXITING ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 7557 1726882096.51327: no more pending results, returning what we have 7557 1726882096.51329: results queue empty 7557 1726882096.51330: checking for any_errors_fatal 7557 1726882096.51332: done checking for any_errors_fatal 7557 1726882096.51333: checking for max_fail_percentage 7557 1726882096.51334: done checking for max_fail_percentage 7557 1726882096.51334: checking to see if all hosts have failed and the running result is not ok 7557 1726882096.51335: done checking to see if all hosts have failed 7557 1726882096.51335: getting the remaining hosts for this loop 7557 1726882096.51336: done getting the remaining hosts for this loop 7557 1726882096.51338: getting the next task for host managed_node3 7557 1726882096.51342: done getting next task for host managed_node3 7557 1726882096.51344: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 7557 1726882096.51347: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7557 1726882096.51354: getting variables 7557 1726882096.51355: in VariableManager get_vars() 7557 1726882096.51381: Calling all_inventory to load vars for managed_node3 7557 1726882096.51383: Calling groups_inventory to load vars for managed_node3 7557 1726882096.51384: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882096.51390: Calling all_plugins_play to load vars for managed_node3 7557 1726882096.51392: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882096.51397: Calling groups_plugins_play to load vars for managed_node3 7557 1726882096.52066: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882096.52929: done with get_vars() 7557 1726882096.52948: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 21:28:16 -0400 (0:00:01.692) 0:00:22.383 ****** 7557 1726882096.53027: entering _queue_task() for managed_node3/package_facts 7557 1726882096.53280: worker is 1 (out of 1 available) 7557 1726882096.53298: exiting _queue_task() for managed_node3/package_facts 7557 1726882096.53312: done queuing things up, now waiting for results queue to drain 7557 1726882096.53313: waiting for pending results... 7557 1726882096.53490: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which packages are installed 7557 1726882096.53600: in run() - task 12673a56-9f93-ed48-b3a5-000000000d4c 7557 1726882096.53610: variable 'ansible_search_path' from source: unknown 7557 1726882096.53614: variable 'ansible_search_path' from source: unknown 7557 1726882096.53642: calling self._execute() 7557 1726882096.53721: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882096.53725: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882096.53734: variable 'omit' from source: magic vars 7557 1726882096.54007: variable 'ansible_distribution_major_version' from source: facts 7557 1726882096.54017: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882096.54024: variable 'omit' from source: magic vars 7557 1726882096.54070: variable 'omit' from source: magic vars 7557 1726882096.54099: variable 'omit' from source: magic vars 7557 1726882096.54130: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7557 1726882096.54156: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7557 1726882096.54173: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7557 1726882096.54185: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7557 1726882096.54204: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7557 1726882096.54224: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7557 1726882096.54227: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882096.54229: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882096.54302: Set connection var ansible_module_compression to ZIP_DEFLATED 7557 1726882096.54311: Set connection 
var ansible_shell_executable to /bin/sh 7557 1726882096.54314: Set connection var ansible_shell_type to sh 7557 1726882096.54317: Set connection var ansible_pipelining to False 7557 1726882096.54319: Set connection var ansible_connection to ssh 7557 1726882096.54321: Set connection var ansible_timeout to 10 7557 1726882096.54338: variable 'ansible_shell_executable' from source: unknown 7557 1726882096.54341: variable 'ansible_connection' from source: unknown 7557 1726882096.54344: variable 'ansible_module_compression' from source: unknown 7557 1726882096.54346: variable 'ansible_shell_type' from source: unknown 7557 1726882096.54348: variable 'ansible_shell_executable' from source: unknown 7557 1726882096.54350: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882096.54355: variable 'ansible_pipelining' from source: unknown 7557 1726882096.54357: variable 'ansible_timeout' from source: unknown 7557 1726882096.54361: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882096.54503: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 7557 1726882096.54511: variable 'omit' from source: magic vars 7557 1726882096.54515: starting attempt loop 7557 1726882096.54518: running the handler 7557 1726882096.54531: _low_level_execute_command(): starting 7557 1726882096.54539: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7557 1726882096.55052: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882096.55055: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882096.55059: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882096.55061: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882096.55112: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882096.55116: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882096.55131: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882096.55187: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882096.56758: stdout chunk (state=3): >>>/root <<< 7557 1726882096.56855: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882096.56882: stderr chunk (state=3): >>><<< 7557 1726882096.56885: stdout chunk (state=3): >>><<< 7557 1726882096.56910: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, 
OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7557 1726882096.56921: _low_level_execute_command(): starting 7557 1726882096.56926: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882096.5690835-8469-149579959479520 `" && echo ansible-tmp-1726882096.5690835-8469-149579959479520="` echo /root/.ansible/tmp/ansible-tmp-1726882096.5690835-8469-149579959479520 `" ) && sleep 0' 7557 1726882096.57355: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882096.57358: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882096.57367: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration <<< 7557 1726882096.57372: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7557 1726882096.57374: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882096.57413: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882096.57417: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882096.57471: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882096.59304: stdout chunk (state=3): >>>ansible-tmp-1726882096.5690835-8469-149579959479520=/root/.ansible/tmp/ansible-tmp-1726882096.5690835-8469-149579959479520 <<< 7557 1726882096.59414: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882096.59438: stderr chunk (state=3): >>><<< 7557 1726882096.59441: stdout chunk (state=3): >>><<< 7557 
1726882096.59454: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882096.5690835-8469-149579959479520=/root/.ansible/tmp/ansible-tmp-1726882096.5690835-8469-149579959479520 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7557 1726882096.59497: variable 'ansible_module_compression' from source: unknown 7557 1726882096.59535: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-7557ap94rh2e/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 7557 1726882096.59585: variable 'ansible_facts' from source: unknown 7557 1726882096.59707: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882096.5690835-8469-149579959479520/AnsiballZ_package_facts.py 7557 1726882096.59811: Sending initial data 7557 1726882096.59814: Sent initial data (160 bytes) 7557 1726882096.60267: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7557 1726882096.60270: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found <<< 7557 1726882096.60273: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882096.60276: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882096.60279: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882096.60331: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882096.60338: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882096.60340: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882096.60384: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 
1726882096.61898: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 7557 1726882096.61937: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 7557 1726882096.61982: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-7557ap94rh2e/tmpcfhgwaq9 /root/.ansible/tmp/ansible-tmp-1726882096.5690835-8469-149579959479520/AnsiballZ_package_facts.py <<< 7557 1726882096.61986: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882096.5690835-8469-149579959479520/AnsiballZ_package_facts.py" <<< 7557 1726882096.62027: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-7557ap94rh2e/tmpcfhgwaq9" to remote "/root/.ansible/tmp/ansible-tmp-1726882096.5690835-8469-149579959479520/AnsiballZ_package_facts.py" <<< 7557 1726882096.62033: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882096.5690835-8469-149579959479520/AnsiballZ_package_facts.py" <<< 7557 1726882096.63068: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882096.63115: stderr chunk (state=3): >>><<< 7557 1726882096.63119: stdout chunk (state=3): >>><<< 7557 1726882096.63145: done transferring module to remote 7557 1726882096.63154: _low_level_execute_command(): starting 7557 1726882096.63159: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882096.5690835-8469-149579959479520/ /root/.ansible/tmp/ansible-tmp-1726882096.5690835-8469-149579959479520/AnsiballZ_package_facts.py && sleep 0' 7557 1726882096.63617: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7557 1726882096.63622: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found <<< 7557 1726882096.63624: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882096.63627: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration <<< 7557 1726882096.63632: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882096.63634: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882096.63671: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882096.63684: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882096.63733: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882096.65420: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882096.65446: stderr chunk (state=3): >>><<< 7557 1726882096.65449: stdout chunk (state=3): >>><<< 7557 1726882096.65465: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7557 1726882096.65468: _low_level_execute_command(): starting 7557 1726882096.65474: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882096.5690835-8469-149579959479520/AnsiballZ_package_facts.py && sleep 0' 7557 1726882096.65918: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882096.65921: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882096.65924: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882096.65926: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882096.65978: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882096.65981: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882096.65985: stderr chunk 
(state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882096.66035: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882097.09884: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": 
"glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "rele<<< 7557 1726882097.09922: stdout chunk (state=3): >>>ase": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": 
"gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null,<<< 7557 1726882097.09947: stdout chunk (state=3): >>> "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": 
"pigz", "version": "2.8", "release": "7.el10",<<< 7557 1726882097.09976: stdout chunk (state=3): >>> "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arc<<< 7557 1726882097.09988: stdout chunk (state=3): >>>h": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": 
[{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.7<<< 7557 1726882097.09999: stdout chunk (state=3): >>>3.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"<<< 7557 1726882097.10042: stdout chunk (state=3): >>>}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": 
"python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": 
"20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], 
"perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": 
"510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1<<< 7557 1726882097.10056: stdout chunk (state=3): >>>.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": 
[{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10<<< 7557 1726882097.10060: stdout chunk (state=3): >>>", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", 
"epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.<<< 7557 1726882097.10082: stdout chunk (state=3): >>>26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": 
"9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 7557 1726882097.11812: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. 
<<< 7557 1726882097.11848: stderr chunk (state=3): >>><<< 7557 1726882097.11851: stdout chunk (state=3): >>><<< 7557 1726882097.11890: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": 
"amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": 
"17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], 
"libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": 
"1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": 
"1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": 
[{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": 
"510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], 
"perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], 
"perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. 
7557 1726882097.13164: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882096.5690835-8469-149579959479520/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7557 1726882097.13179: _low_level_execute_command(): starting 7557 1726882097.13184: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882096.5690835-8469-149579959479520/ > /dev/null 2>&1 && sleep 0' 7557 1726882097.13650: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7557 1726882097.13655: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found <<< 7557 1726882097.13658: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882097.13660: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882097.13662: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found <<< 7557 1726882097.13664: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882097.13716: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882097.13719: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882097.13725: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882097.13771: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882097.15632: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882097.15635: stdout chunk (state=3): >>><<< 7557 1726882097.15638: stderr chunk (state=3): >>><<< 7557 1726882097.15651: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7557 1726882097.15799: handler run complete 7557 1726882097.16275: variable 'ansible_facts' from source: unknown 7557 1726882097.16580: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882097.17608: variable 'ansible_facts' from source: unknown 7557 1726882097.18000: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882097.18584: attempt loop complete, returning result 7557 1726882097.18607: _execute() done 7557 1726882097.18613: dumping result to json 7557 1726882097.18816: done dumping result, returning 7557 1726882097.18830: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which packages are installed [12673a56-9f93-ed48-b3a5-000000000d4c] 7557 1726882097.18840: sending task result for task 12673a56-9f93-ed48-b3a5-000000000d4c 7557 1726882097.21145: done sending task result for task 12673a56-9f93-ed48-b3a5-000000000d4c 7557 1726882097.21149: WORKER PROCESS EXITING ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 7557 1726882097.21300: no more pending results, returning what we have 7557 1726882097.21305: results queue empty 7557 1726882097.21306: checking for any_errors_fatal 7557 1726882097.21310: done checking for any_errors_fatal 7557 1726882097.21311: checking for max_fail_percentage 7557 1726882097.21313: done checking for max_fail_percentage 7557 1726882097.21313: checking to see if all hosts have failed and the running result is not ok 7557 1726882097.21315: done checking to see if all hosts have failed 7557 1726882097.21315: getting the remaining hosts for this loop 7557 1726882097.21316: done getting the remaining hosts for this loop 7557 1726882097.21320: getting the next task for host managed_node3 7557 1726882097.21326: done getting next task for host managed_node3 7557 1726882097.21330: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 7557 1726882097.21333: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7557 1726882097.21344: getting variables 7557 1726882097.21345: in VariableManager get_vars() 7557 1726882097.21385: Calling all_inventory to load vars for managed_node3 7557 1726882097.21387: Calling groups_inventory to load vars for managed_node3 7557 1726882097.21390: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882097.21404: Calling all_plugins_play to load vars for managed_node3 7557 1726882097.21407: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882097.21410: Calling groups_plugins_play to load vars for managed_node3 7557 1726882097.22624: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882097.24200: done with get_vars() 7557 1726882097.24239: done getting variables 7557 1726882097.24305: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 21:28:17 -0400 (0:00:00.713) 0:00:23.096 ****** 7557 1726882097.24352: entering _queue_task() for managed_node3/debug 7557 1726882097.24734: worker is 1 (out of 1 available) 7557 1726882097.24747: exiting _queue_task() for managed_node3/debug 7557 1726882097.24758: done queuing things up, now waiting for results queue to drain 7557 1726882097.24759: waiting for pending results... 7557 1726882097.25024: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Print network provider 7557 1726882097.25230: in run() - task 12673a56-9f93-ed48-b3a5-00000000006a 7557 1726882097.25235: variable 'ansible_search_path' from source: unknown 7557 1726882097.25239: variable 'ansible_search_path' from source: unknown 7557 1726882097.25245: calling self._execute() 7557 1726882097.25362: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882097.25374: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882097.25390: variable 'omit' from source: magic vars 7557 1726882097.25797: variable 'ansible_distribution_major_version' from source: facts 7557 1726882097.25881: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882097.25885: variable 'omit' from source: magic vars 7557 1726882097.25902: variable 'omit' from source: magic vars 7557 1726882097.26013: variable 'network_provider' from source: set_fact 7557 1726882097.26040: variable 'omit' from source: magic vars 7557 1726882097.26087: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7557 1726882097.26138: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7557 1726882097.26164: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7557 1726882097.26186: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7557 1726882097.26213: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7557 1726882097.26248: variable 
'inventory_hostname' from source: host vars for 'managed_node3' 7557 1726882097.26298: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882097.26306: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882097.26384: Set connection var ansible_module_compression to ZIP_DEFLATED 7557 1726882097.26400: Set connection var ansible_shell_executable to /bin/sh 7557 1726882097.26409: Set connection var ansible_shell_type to sh 7557 1726882097.26428: Set connection var ansible_pipelining to False 7557 1726882097.26436: Set connection var ansible_connection to ssh 7557 1726882097.26446: Set connection var ansible_timeout to 10 7557 1726882097.26498: variable 'ansible_shell_executable' from source: unknown 7557 1726882097.26501: variable 'ansible_connection' from source: unknown 7557 1726882097.26504: variable 'ansible_module_compression' from source: unknown 7557 1726882097.26506: variable 'ansible_shell_type' from source: unknown 7557 1726882097.26508: variable 'ansible_shell_executable' from source: unknown 7557 1726882097.26510: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882097.26512: variable 'ansible_pipelining' from source: unknown 7557 1726882097.26514: variable 'ansible_timeout' from source: unknown 7557 1726882097.26515: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882097.26672: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 7557 1726882097.26746: variable 'omit' from source: magic vars 7557 1726882097.26752: starting attempt loop 7557 1726882097.26755: running the handler 7557 1726882097.26763: handler run complete 7557 1726882097.26781: attempt loop complete, returning result 7557 1726882097.26789: _execute() done 7557 1726882097.26798: dumping result to json 7557 1726882097.26806: done dumping result, returning 7557 1726882097.26818: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Print network provider [12673a56-9f93-ed48-b3a5-00000000006a] 7557 1726882097.26830: sending task result for task 12673a56-9f93-ed48-b3a5-00000000006a ok: [managed_node3] => {} MSG: Using network provider: nm 7557 1726882097.27029: no more pending results, returning what we have 7557 1726882097.27033: results queue empty 7557 1726882097.27034: checking for any_errors_fatal 7557 1726882097.27045: done checking for any_errors_fatal 7557 1726882097.27045: checking for max_fail_percentage 7557 1726882097.27047: done checking for max_fail_percentage 7557 1726882097.27048: checking to see if all hosts have failed and the running result is not ok 7557 1726882097.27049: done checking to see if all hosts have failed 7557 1726882097.27049: getting the remaining hosts for this loop 7557 1726882097.27051: done getting the remaining hosts for this loop 7557 1726882097.27054: getting the next task for host managed_node3 7557 1726882097.27062: done getting next task for host managed_node3 7557 1726882097.27066: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 7557 1726882097.27069: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, 
pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7557 1726882097.27081: getting variables 7557 1726882097.27083: in VariableManager get_vars() 7557 1726882097.27138: Calling all_inventory to load vars for managed_node3 7557 1726882097.27142: Calling groups_inventory to load vars for managed_node3 7557 1726882097.27144: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882097.27155: Calling all_plugins_play to load vars for managed_node3 7557 1726882097.27158: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882097.27162: Calling groups_plugins_play to load vars for managed_node3 7557 1726882097.27908: done sending task result for task 12673a56-9f93-ed48-b3a5-00000000006a 7557 1726882097.27912: WORKER PROCESS EXITING 7557 1726882097.28914: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882097.30529: done with get_vars() 7557 1726882097.30557: done getting variables 7557 1726882097.30616: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 21:28:17 -0400 (0:00:00.062) 0:00:23.159 ****** 7557 1726882097.30650: entering _queue_task() for managed_node3/fail 7557 1726882097.31101: worker is 1 (out of 1 available) 7557 1726882097.31111: exiting _queue_task() for managed_node3/fail 7557 1726882097.31123: done queuing things up, now waiting for results queue to drain 7557 1726882097.31124: waiting for pending results... 
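The "Print network provider" task that just completed (task path roles/network/tasks/main.yml:7) is a plain debug action over the network_provider fact set earlier in the run; a minimal sketch of such a task, assuming the message template matches the rendered output "Using network provider: nm", is:

    - name: Print network provider
      debug:
        msg: "Using network provider: {{ network_provider }}"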
7557 1726882097.31313: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 7557 1726882097.31474: in run() - task 12673a56-9f93-ed48-b3a5-00000000006b 7557 1726882097.31499: variable 'ansible_search_path' from source: unknown 7557 1726882097.31509: variable 'ansible_search_path' from source: unknown 7557 1726882097.31555: calling self._execute() 7557 1726882097.31665: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882097.31683: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882097.31701: variable 'omit' from source: magic vars 7557 1726882097.32098: variable 'ansible_distribution_major_version' from source: facts 7557 1726882097.32121: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882097.32253: variable 'network_state' from source: role '' defaults 7557 1726882097.32268: Evaluated conditional (network_state != {}): False 7557 1726882097.32276: when evaluation is False, skipping this task 7557 1726882097.32284: _execute() done 7557 1726882097.32291: dumping result to json 7557 1726882097.32304: done dumping result, returning 7557 1726882097.32316: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [12673a56-9f93-ed48-b3a5-00000000006b] 7557 1726882097.32332: sending task result for task 12673a56-9f93-ed48-b3a5-00000000006b 7557 1726882097.32505: done sending task result for task 12673a56-9f93-ed48-b3a5-00000000006b 7557 1726882097.32509: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 7557 1726882097.32555: no more pending results, returning what we have 7557 1726882097.32560: results queue empty 7557 1726882097.32561: checking for any_errors_fatal 7557 1726882097.32567: done checking for any_errors_fatal 7557 1726882097.32568: checking for max_fail_percentage 7557 1726882097.32570: done checking for max_fail_percentage 7557 1726882097.32571: checking to see if all hosts have failed and the running result is not ok 7557 1726882097.32571: done checking to see if all hosts have failed 7557 1726882097.32572: getting the remaining hosts for this loop 7557 1726882097.32574: done getting the remaining hosts for this loop 7557 1726882097.32578: getting the next task for host managed_node3 7557 1726882097.32583: done getting next task for host managed_node3 7557 1726882097.32587: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 7557 1726882097.32591: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7557 1726882097.32614: getting variables 7557 1726882097.32616: in VariableManager get_vars() 7557 1726882097.32666: Calling all_inventory to load vars for managed_node3 7557 1726882097.32668: Calling groups_inventory to load vars for managed_node3 7557 1726882097.32671: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882097.32682: Calling all_plugins_play to load vars for managed_node3 7557 1726882097.32685: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882097.32688: Calling groups_plugins_play to load vars for managed_node3 7557 1726882097.34233: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882097.35808: done with get_vars() 7557 1726882097.35840: done getting variables 7557 1726882097.35904: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 21:28:17 -0400 (0:00:00.052) 0:00:23.212 ****** 7557 1726882097.35945: entering _queue_task() for managed_node3/fail 7557 1726882097.36389: worker is 1 (out of 1 available) 7557 1726882097.36405: exiting _queue_task() for managed_node3/fail 7557 1726882097.36416: done queuing things up, now waiting for results queue to drain 7557 1726882097.36418: waiting for pending results... 
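The initscripts guard skipped above (task path roles/network/tasks/main.yml:11) reports false_condition "network_state != {}", so the fail action never fires while network_state keeps its empty role default. A sketch under that assumption follows; the message text and the second condition are assumptions, since Ansible stops evaluating a `when` list at the first false entry and the log only shows the first:

    - name: Abort applying the network state configuration if using the `network_state` variable with the initscripts provider
      fail:
        msg: The network_state variable is not supported with the initscripts provider  # assumed wording
      when:
        - network_state != {}
        - network_provider == "initscripts"  # assumed second condition, not shown in the log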
7557 1726882097.36639: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 7557 1726882097.36791: in run() - task 12673a56-9f93-ed48-b3a5-00000000006c 7557 1726882097.36900: variable 'ansible_search_path' from source: unknown 7557 1726882097.36905: variable 'ansible_search_path' from source: unknown 7557 1726882097.36908: calling self._execute() 7557 1726882097.36981: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882097.36995: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882097.37011: variable 'omit' from source: magic vars 7557 1726882097.37413: variable 'ansible_distribution_major_version' from source: facts 7557 1726882097.37430: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882097.37573: variable 'network_state' from source: role '' defaults 7557 1726882097.37591: Evaluated conditional (network_state != {}): False 7557 1726882097.37602: when evaluation is False, skipping this task 7557 1726882097.37679: _execute() done 7557 1726882097.37683: dumping result to json 7557 1726882097.37685: done dumping result, returning 7557 1726882097.37688: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [12673a56-9f93-ed48-b3a5-00000000006c] 7557 1726882097.37691: sending task result for task 12673a56-9f93-ed48-b3a5-00000000006c 7557 1726882097.37768: done sending task result for task 12673a56-9f93-ed48-b3a5-00000000006c 7557 1726882097.37772: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 7557 1726882097.37836: no more pending results, returning what we have 7557 1726882097.37840: results queue empty 7557 1726882097.37841: checking for any_errors_fatal 7557 1726882097.37852: done checking for any_errors_fatal 7557 1726882097.37853: checking for max_fail_percentage 7557 1726882097.37855: done checking for max_fail_percentage 7557 1726882097.37856: checking to see if all hosts have failed and the running result is not ok 7557 1726882097.37857: done checking to see if all hosts have failed 7557 1726882097.37858: getting the remaining hosts for this loop 7557 1726882097.37860: done getting the remaining hosts for this loop 7557 1726882097.37865: getting the next task for host managed_node3 7557 1726882097.37872: done getting next task for host managed_node3 7557 1726882097.37877: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 7557 1726882097.38096: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7557 1726882097.38117: getting variables 7557 1726882097.38119: in VariableManager get_vars() 7557 1726882097.38169: Calling all_inventory to load vars for managed_node3 7557 1726882097.38173: Calling groups_inventory to load vars for managed_node3 7557 1726882097.38175: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882097.38186: Calling all_plugins_play to load vars for managed_node3 7557 1726882097.38189: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882097.38192: Calling groups_plugins_play to load vars for managed_node3 7557 1726882097.46404: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882097.49363: done with get_vars() 7557 1726882097.49392: done getting variables 7557 1726882097.49540: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 21:28:17 -0400 (0:00:00.136) 0:00:23.348 ****** 7557 1726882097.49568: entering _queue_task() for managed_node3/fail 7557 1726882097.49937: worker is 1 (out of 1 available) 7557 1726882097.49954: exiting _queue_task() for managed_node3/fail 7557 1726882097.49967: done queuing things up, now waiting for results queue to drain 7557 1726882097.49968: waiting for pending results... 
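The system-version guard skipped above (task path roles/network/tasks/main.yml:18) short-circuits on the same "network_state != {}" check before any version comparison is reached. A sketch, assuming the version check implied by the task name:

    - name: Abort applying the network state configuration if the system version of the managed host is below 8
      fail:
        msg: Applying the network state configuration requires EL8 or later  # assumed wording
      when:
        - network_state != {}
        - ansible_distribution_major_version | int < 8  # assumed; never evaluated here because the first condition was already False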
7557 1726882097.50621: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 7557 1726882097.50933: in run() - task 12673a56-9f93-ed48-b3a5-00000000006d 7557 1726882097.50938: variable 'ansible_search_path' from source: unknown 7557 1726882097.50942: variable 'ansible_search_path' from source: unknown 7557 1726882097.50946: calling self._execute() 7557 1726882097.51229: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882097.51233: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882097.51247: variable 'omit' from source: magic vars 7557 1726882097.52046: variable 'ansible_distribution_major_version' from source: facts 7557 1726882097.52059: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882097.52535: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 7557 1726882097.54606: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 7557 1726882097.54697: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 7557 1726882097.54746: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 7557 1726882097.54784: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 7557 1726882097.54828: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 7557 1726882097.54921: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7557 1726882097.54955: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7557 1726882097.54985: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7557 1726882097.55041: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7557 1726882097.55058: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7557 1726882097.55161: variable 'ansible_distribution_major_version' from source: facts 7557 1726882097.55182: Evaluated conditional (ansible_distribution_major_version | int > 9): True 7557 1726882097.55309: variable 'ansible_distribution' from source: facts 7557 1726882097.55318: variable '__network_rh_distros' from source: role '' defaults 7557 1726882097.55331: Evaluated conditional (ansible_distribution in __network_rh_distros): True 7557 1726882097.55581: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7557 1726882097.55620: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7557 1726882097.55651: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7557 1726882097.55701: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7557 1726882097.55720: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7557 1726882097.55770: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7557 1726882097.55801: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7557 1726882097.55830: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7557 1726882097.55872: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7557 1726882097.55890: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7557 1726882097.55941: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7557 1726882097.55969: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7557 1726882097.56000: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7557 1726882097.56043: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7557 1726882097.56199: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7557 1726882097.56377: variable 'network_connections' from source: task vars 7557 1726882097.56399: variable 'interface' from source: play vars 7557 1726882097.56467: variable 'interface' from source: play vars 7557 1726882097.56483: variable 'network_state' from source: role '' defaults 7557 1726882097.56556: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 7557 1726882097.56742: Loading TestModule 
'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 7557 1726882097.56784: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 7557 1726882097.56824: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 7557 1726882097.56858: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 7557 1726882097.56906: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 7557 1726882097.56933: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 7557 1726882097.56973: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 7557 1726882097.57006: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 7557 1726882097.57035: Evaluated conditional (network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0 or network_state.get("interfaces", []) | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0): False 7557 1726882097.57043: when evaluation is False, skipping this task 7557 1726882097.57050: _execute() done 7557 1726882097.57057: dumping result to json 7557 1726882097.57063: done dumping result, returning 7557 1726882097.57074: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [12673a56-9f93-ed48-b3a5-00000000006d] 7557 1726882097.57084: sending task result for task 12673a56-9f93-ed48-b3a5-00000000006d skipping: [managed_node3] => { "changed": false, "false_condition": "network_connections | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0 or network_state.get(\"interfaces\", []) | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0", "skip_reason": "Conditional result was False" } 7557 1726882097.57304: no more pending results, returning what we have 7557 1726882097.57308: results queue empty 7557 1726882097.57309: checking for any_errors_fatal 7557 1726882097.57316: done checking for any_errors_fatal 7557 1726882097.57316: checking for max_fail_percentage 7557 1726882097.57318: done checking for max_fail_percentage 7557 1726882097.57319: checking to see if all hosts have failed and the running result is not ok 7557 1726882097.57319: done checking to see if all hosts have failed 7557 1726882097.57320: getting the remaining hosts for this loop 7557 1726882097.57322: done getting the remaining hosts for this loop 7557 1726882097.57325: getting the next task for host managed_node3 7557 1726882097.57331: done getting next task for host managed_node3 7557 1726882097.57335: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 7557 1726882097.57337: ^ state is: 
HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7557 1726882097.57354: done sending task result for task 12673a56-9f93-ed48-b3a5-00000000006d 7557 1726882097.57357: WORKER PROCESS EXITING 7557 1726882097.57585: getting variables 7557 1726882097.57587: in VariableManager get_vars() 7557 1726882097.57640: Calling all_inventory to load vars for managed_node3 7557 1726882097.57643: Calling groups_inventory to load vars for managed_node3 7557 1726882097.57645: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882097.57654: Calling all_plugins_play to load vars for managed_node3 7557 1726882097.57656: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882097.57659: Calling groups_plugins_play to load vars for managed_node3 7557 1726882097.59032: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882097.60706: done with get_vars() 7557 1726882097.60728: done getting variables 7557 1726882097.60782: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 21:28:17 -0400 (0:00:00.112) 0:00:23.461 ****** 7557 1726882097.60815: entering _queue_task() for managed_node3/dnf 7557 1726882097.61149: worker is 1 (out of 1 available) 7557 1726882097.61162: exiting _queue_task() for managed_node3/dnf 7557 1726882097.61175: done queuing things up, now waiting for results queue to drain 7557 1726882097.61176: waiting for pending results... 
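The teaming guard skipped above evaluates the selectattr chain quoted in its false_condition against both network_connections and network_state; in this run neither contains an entry with type "team", so the chain yields an empty list and the conditional is False. A standalone illustration of that filter against a hypothetical connection list (the name and type below are placeholders, not values from this run):

    - name: Show whether any team connections are defined (illustration only)
      vars:
        network_connections:
          - name: ethtest0      # placeholder connection
            type: ethernet
      debug:
        msg: >-
          {{ network_connections | selectattr('type', 'defined')
             | selectattr('type', 'match', '^team$') | list | length > 0 }}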
7557 1726882097.61486: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 7557 1726882097.61674: in run() - task 12673a56-9f93-ed48-b3a5-00000000006e 7557 1726882097.61708: variable 'ansible_search_path' from source: unknown 7557 1726882097.61729: variable 'ansible_search_path' from source: unknown 7557 1726882097.61779: calling self._execute() 7557 1726882097.61947: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882097.61951: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882097.61954: variable 'omit' from source: magic vars 7557 1726882097.62511: variable 'ansible_distribution_major_version' from source: facts 7557 1726882097.62514: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882097.62597: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 7557 1726882097.65127: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 7557 1726882097.65214: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 7557 1726882097.65262: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 7557 1726882097.65306: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 7557 1726882097.65338: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 7557 1726882097.65427: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7557 1726882097.65466: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7557 1726882097.65499: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7557 1726882097.65568: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7557 1726882097.65571: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7557 1726882097.65691: variable 'ansible_distribution' from source: facts 7557 1726882097.65707: variable 'ansible_distribution_major_version' from source: facts 7557 1726882097.65786: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 7557 1726882097.65854: variable '__network_wireless_connections_defined' from source: role '' defaults 7557 1726882097.66002: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7557 1726882097.66032: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7557 1726882097.66061: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7557 1726882097.66112: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7557 1726882097.66130: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7557 1726882097.66173: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7557 1726882097.66205: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7557 1726882097.66237: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7557 1726882097.66327: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7557 1726882097.66330: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7557 1726882097.66341: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7557 1726882097.66368: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7557 1726882097.66399: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7557 1726882097.66445: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7557 1726882097.66462: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7557 1726882097.66632: variable 'network_connections' from source: task vars 7557 1726882097.66654: variable 'interface' from source: play vars 7557 1726882097.66723: variable 'interface' from source: play vars 7557 1726882097.66808: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 7557 1726882097.67086: Loading TestModule 'core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 7557 1726882097.67089: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 7557 1726882097.67096: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 7557 1726882097.67117: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 7557 1726882097.67163: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 7557 1726882097.67199: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 7557 1726882097.67241: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 7557 1726882097.67270: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 7557 1726882097.67327: variable '__network_team_connections_defined' from source: role '' defaults 7557 1726882097.67576: variable 'network_connections' from source: task vars 7557 1726882097.67588: variable 'interface' from source: play vars 7557 1726882097.67656: variable 'interface' from source: play vars 7557 1726882097.67683: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 7557 1726882097.67742: when evaluation is False, skipping this task 7557 1726882097.67744: _execute() done 7557 1726882097.67746: dumping result to json 7557 1726882097.67748: done dumping result, returning 7557 1726882097.67750: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [12673a56-9f93-ed48-b3a5-00000000006e] 7557 1726882097.67752: sending task result for task 12673a56-9f93-ed48-b3a5-00000000006e skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 7557 1726882097.67905: no more pending results, returning what we have 7557 1726882097.67910: results queue empty 7557 1726882097.67911: checking for any_errors_fatal 7557 1726882097.67920: done checking for any_errors_fatal 7557 1726882097.67921: checking for max_fail_percentage 7557 1726882097.67923: done checking for max_fail_percentage 7557 1726882097.67924: checking to see if all hosts have failed and the running result is not ok 7557 1726882097.67925: done checking to see if all hosts have failed 7557 1726882097.67926: getting the remaining hosts for this loop 7557 1726882097.67927: done getting the remaining hosts for this loop 7557 1726882097.67931: getting the next task for host managed_node3 7557 1726882097.67938: done getting next task for host managed_node3 7557 1726882097.67943: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 7557 1726882097.67946: ^ state is: HOST STATE: block=2, task=19, 
rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7557 1726882097.67966: getting variables 7557 1726882097.67967: in VariableManager get_vars() 7557 1726882097.68029: Calling all_inventory to load vars for managed_node3 7557 1726882097.68032: Calling groups_inventory to load vars for managed_node3 7557 1726882097.68035: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882097.68046: Calling all_plugins_play to load vars for managed_node3 7557 1726882097.68049: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882097.68052: Calling groups_plugins_play to load vars for managed_node3 7557 1726882097.68845: done sending task result for task 12673a56-9f93-ed48-b3a5-00000000006e 7557 1726882097.68849: WORKER PROCESS EXITING 7557 1726882097.69844: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882097.71400: done with get_vars() 7557 1726882097.71438: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 7557 1726882097.71538: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 21:28:17 -0400 (0:00:00.107) 0:00:23.568 ****** 7557 1726882097.71579: entering _queue_task() for managed_node3/yum 7557 1726882097.71978: worker is 1 (out of 1 available) 7557 1726882097.71991: exiting _queue_task() for managed_node3/yum 7557 1726882097.72207: done queuing things up, now waiting for results queue to drain 7557 1726882097.72209: waiting for pending results... 
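The DNF update-check skipped above (task path roles/network/tasks/main.yml:36) only runs when wireless or team connections are defined; immediately afterwards the log shows ansible.builtin.yum being redirected to the dnf action plugin for the YUM variant of the same check. A rough sketch of the skipped task, with the package-list variable, state, and check_mode handling treated as assumptions (only the two `when` conditions are taken from the log):

    - name: Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces
      dnf:
        name: "{{ network_packages }}"  # assumed variable holding the role's package list
        state: latest
      check_mode: true  # assumed; the task would only check whether updates exist
      when:
        - ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7
        - __network_wireless_connections_defined or __network_team_connections_defined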
7557 1726882097.72331: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 7557 1726882097.72496: in run() - task 12673a56-9f93-ed48-b3a5-00000000006f 7557 1726882097.72519: variable 'ansible_search_path' from source: unknown 7557 1726882097.72528: variable 'ansible_search_path' from source: unknown 7557 1726882097.72586: calling self._execute() 7557 1726882097.72711: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882097.72724: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882097.72738: variable 'omit' from source: magic vars 7557 1726882097.73166: variable 'ansible_distribution_major_version' from source: facts 7557 1726882097.73184: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882097.73378: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 7557 1726882097.75838: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 7557 1726882097.75927: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 7557 1726882097.75971: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 7557 1726882097.76014: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 7557 1726882097.76050: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 7557 1726882097.76137: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7557 1726882097.76176: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7557 1726882097.76210: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7557 1726882097.76256: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7557 1726882097.76362: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7557 1726882097.76391: variable 'ansible_distribution_major_version' from source: facts 7557 1726882097.76421: Evaluated conditional (ansible_distribution_major_version | int < 8): False 7557 1726882097.76429: when evaluation is False, skipping this task 7557 1726882097.76436: _execute() done 7557 1726882097.76442: dumping result to json 7557 1726882097.76448: done dumping result, returning 7557 1726882097.76459: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [12673a56-9f93-ed48-b3a5-00000000006f] 7557 1726882097.76474: sending task result for 
task 12673a56-9f93-ed48-b3a5-00000000006f 7557 1726882097.76710: done sending task result for task 12673a56-9f93-ed48-b3a5-00000000006f 7557 1726882097.76713: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 7557 1726882097.76765: no more pending results, returning what we have 7557 1726882097.76769: results queue empty 7557 1726882097.76770: checking for any_errors_fatal 7557 1726882097.76777: done checking for any_errors_fatal 7557 1726882097.76778: checking for max_fail_percentage 7557 1726882097.76780: done checking for max_fail_percentage 7557 1726882097.76781: checking to see if all hosts have failed and the running result is not ok 7557 1726882097.76782: done checking to see if all hosts have failed 7557 1726882097.76782: getting the remaining hosts for this loop 7557 1726882097.76784: done getting the remaining hosts for this loop 7557 1726882097.76787: getting the next task for host managed_node3 7557 1726882097.76798: done getting next task for host managed_node3 7557 1726882097.76802: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 7557 1726882097.76805: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7557 1726882097.76825: getting variables 7557 1726882097.76827: in VariableManager get_vars() 7557 1726882097.76878: Calling all_inventory to load vars for managed_node3 7557 1726882097.76881: Calling groups_inventory to load vars for managed_node3 7557 1726882097.76883: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882097.77101: Calling all_plugins_play to load vars for managed_node3 7557 1726882097.77106: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882097.77110: Calling groups_plugins_play to load vars for managed_node3 7557 1726882097.78732: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882097.80264: done with get_vars() 7557 1726882097.80289: done getting variables 7557 1726882097.80347: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 21:28:17 -0400 (0:00:00.087) 0:00:23.656 ****** 7557 1726882097.80379: entering _queue_task() for managed_node3/fail 7557 1726882097.80707: worker is 1 (out of 1 available) 7557 1726882097.80719: exiting _queue_task() for managed_node3/fail 7557 1726882097.80730: done queuing things up, now waiting for results queue to drain 7557 1726882097.80731: waiting for pending results... 7557 1726882097.81117: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 7557 1726882097.81181: in run() - task 12673a56-9f93-ed48-b3a5-000000000070 7557 1726882097.81209: variable 'ansible_search_path' from source: unknown 7557 1726882097.81229: variable 'ansible_search_path' from source: unknown 7557 1726882097.81275: calling self._execute() 7557 1726882097.81391: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882097.81411: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882097.81435: variable 'omit' from source: magic vars 7557 1726882097.81885: variable 'ansible_distribution_major_version' from source: facts 7557 1726882097.81923: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882097.82066: variable '__network_wireless_connections_defined' from source: role '' defaults 7557 1726882097.82314: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 7557 1726882097.84802: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 7557 1726882097.84888: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 7557 1726882097.84923: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 7557 1726882097.84955: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 7557 1726882097.84972: Loading FilterModule 'urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 7557 1726882097.85040: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7557 1726882097.85065: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7557 1726882097.85081: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7557 1726882097.85112: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7557 1726882097.85123: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7557 1726882097.85158: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7557 1726882097.85175: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7557 1726882097.85192: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7557 1726882097.85222: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7557 1726882097.85233: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7557 1726882097.85264: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7557 1726882097.85281: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7557 1726882097.85303: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7557 1726882097.85328: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7557 1726882097.85338: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7557 1726882097.85459: variable 'network_connections' from source: task vars 7557 1726882097.85470: variable 
'interface' from source: play vars 7557 1726882097.85525: variable 'interface' from source: play vars 7557 1726882097.85575: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 7557 1726882097.85686: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 7557 1726882097.85724: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 7557 1726882097.85747: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 7557 1726882097.85768: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 7557 1726882097.85804: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 7557 1726882097.85824: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 7557 1726882097.85841: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 7557 1726882097.85859: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 7557 1726882097.85897: variable '__network_team_connections_defined' from source: role '' defaults 7557 1726882097.86054: variable 'network_connections' from source: task vars 7557 1726882097.86058: variable 'interface' from source: play vars 7557 1726882097.86104: variable 'interface' from source: play vars 7557 1726882097.86124: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 7557 1726882097.86128: when evaluation is False, skipping this task 7557 1726882097.86131: _execute() done 7557 1726882097.86133: dumping result to json 7557 1726882097.86135: done dumping result, returning 7557 1726882097.86145: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [12673a56-9f93-ed48-b3a5-000000000070] 7557 1726882097.86147: sending task result for task 12673a56-9f93-ed48-b3a5-000000000070 7557 1726882097.86236: done sending task result for task 12673a56-9f93-ed48-b3a5-000000000070 7557 1726882097.86238: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 7557 1726882097.86285: no more pending results, returning what we have 7557 1726882097.86289: results queue empty 7557 1726882097.86290: checking for any_errors_fatal 7557 1726882097.86298: done checking for any_errors_fatal 7557 1726882097.86299: checking for max_fail_percentage 7557 1726882097.86301: done checking for max_fail_percentage 7557 1726882097.86302: checking to see if all hosts have failed and the running result is not ok 7557 1726882097.86303: done checking to see if all hosts have failed 7557 1726882097.86303: getting the remaining hosts for this loop 7557 1726882097.86305: done getting the remaining hosts for this loop 7557 1726882097.86308: 
getting the next task for host managed_node3 7557 1726882097.86315: done getting next task for host managed_node3 7557 1726882097.86319: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 7557 1726882097.86321: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7557 1726882097.86339: getting variables 7557 1726882097.86340: in VariableManager get_vars() 7557 1726882097.86402: Calling all_inventory to load vars for managed_node3 7557 1726882097.86406: Calling groups_inventory to load vars for managed_node3 7557 1726882097.86408: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882097.86417: Calling all_plugins_play to load vars for managed_node3 7557 1726882097.86420: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882097.86422: Calling groups_plugins_play to load vars for managed_node3 7557 1726882097.87724: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882097.88591: done with get_vars() 7557 1726882097.88609: done getting variables 7557 1726882097.88652: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 21:28:17 -0400 (0:00:00.082) 0:00:23.739 ****** 7557 1726882097.88677: entering _queue_task() for managed_node3/package 7557 1726882097.88914: worker is 1 (out of 1 available) 7557 1726882097.88927: exiting _queue_task() for managed_node3/package 7557 1726882097.88940: done queuing things up, now waiting for results queue to drain 7557 1726882097.88942: waiting for pending results... 
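The skip recorded just above comes from a guarded fail task in the role: the log shows the fail action plugin being loaded for roles/network/tasks/main.yml:60 and the condition __network_wireless_connections_defined or __network_team_connections_defined evaluating to False. A minimal sketch of what such a task presumably looks like follows; the module and the when clause are taken from the log, while the message text (and any extra consent flag the role may also check) is an assumption.

    # Hypothetical reconstruction of the skipped task at roles/network/tasks/main.yml:60.
    # Module (fail) and when-condition are confirmed by the log above; the msg text is assumed.
    - name: Ask user's consent to restart NetworkManager due to wireless or team interfaces
      fail:
        msg: >-
          NetworkManager must be restarted to apply wireless or team connections;
          confirm that a restart is acceptable before re-running the role.
      when: __network_wireless_connections_defined or __network_team_connections_defined

Because neither wireless nor team connections are defined for this run, the condition is False and the task is skipped instead of failing the play.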
7557 1726882097.89122: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install packages 7557 1726882097.89212: in run() - task 12673a56-9f93-ed48-b3a5-000000000071 7557 1726882097.89223: variable 'ansible_search_path' from source: unknown 7557 1726882097.89227: variable 'ansible_search_path' from source: unknown 7557 1726882097.89256: calling self._execute() 7557 1726882097.89339: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882097.89343: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882097.89353: variable 'omit' from source: magic vars 7557 1726882097.89799: variable 'ansible_distribution_major_version' from source: facts 7557 1726882097.89802: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882097.89968: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 7557 1726882097.90284: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 7557 1726882097.90340: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 7557 1726882097.90383: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 7557 1726882097.90470: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 7557 1726882097.90598: variable 'network_packages' from source: role '' defaults 7557 1726882097.90736: variable '__network_provider_setup' from source: role '' defaults 7557 1726882097.90765: variable '__network_service_name_default_nm' from source: role '' defaults 7557 1726882097.90852: variable '__network_service_name_default_nm' from source: role '' defaults 7557 1726882097.90860: variable '__network_packages_default_nm' from source: role '' defaults 7557 1726882097.90909: variable '__network_packages_default_nm' from source: role '' defaults 7557 1726882097.91039: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 7557 1726882097.92798: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 7557 1726882097.92805: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 7557 1726882097.92809: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 7557 1726882097.92816: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 7557 1726882097.92846: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 7557 1726882097.92927: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7557 1726882097.92959: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7557 1726882097.92987: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7557 1726882097.93035: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7557 1726882097.93059: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7557 1726882097.93111: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7557 1726882097.93139: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7557 1726882097.93168: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7557 1726882097.93198: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7557 1726882097.93222: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7557 1726882097.93598: variable '__network_packages_default_gobject_packages' from source: role '' defaults 7557 1726882097.93601: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7557 1726882097.93603: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7557 1726882097.93606: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7557 1726882097.93608: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7557 1726882097.93616: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7557 1726882097.93707: variable 'ansible_python' from source: facts 7557 1726882097.93738: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 7557 1726882097.93824: variable '__network_wpa_supplicant_required' from source: role '' defaults 7557 1726882097.93905: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 7557 1726882097.94027: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7557 1726882097.94056: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, 
class_only=False) 7557 1726882097.94083: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7557 1726882097.94128: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7557 1726882097.94146: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7557 1726882097.94196: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7557 1726882097.94233: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7557 1726882097.94262: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7557 1726882097.94306: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7557 1726882097.94326: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7557 1726882097.94474: variable 'network_connections' from source: task vars 7557 1726882097.94479: variable 'interface' from source: play vars 7557 1726882097.94558: variable 'interface' from source: play vars 7557 1726882097.94606: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 7557 1726882097.94626: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 7557 1726882097.94646: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 7557 1726882097.94667: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 7557 1726882097.94710: variable '__network_wireless_connections_defined' from source: role '' defaults 7557 1726882097.94878: variable 'network_connections' from source: task vars 7557 1726882097.94881: variable 'interface' from source: play vars 7557 1726882097.94954: variable 'interface' from source: play vars 7557 1726882097.94977: variable '__network_packages_default_wireless' from source: role '' defaults 7557 1726882097.95033: variable '__network_wireless_connections_defined' from source: role '' defaults 7557 1726882097.95221: variable 'network_connections' from source: task vars 7557 1726882097.95224: variable 'interface' from source: play vars 7557 
1726882097.95271: variable 'interface' from source: play vars 7557 1726882097.95287: variable '__network_packages_default_team' from source: role '' defaults 7557 1726882097.95342: variable '__network_team_connections_defined' from source: role '' defaults 7557 1726882097.95532: variable 'network_connections' from source: task vars 7557 1726882097.95535: variable 'interface' from source: play vars 7557 1726882097.95579: variable 'interface' from source: play vars 7557 1726882097.95621: variable '__network_service_name_default_initscripts' from source: role '' defaults 7557 1726882097.95662: variable '__network_service_name_default_initscripts' from source: role '' defaults 7557 1726882097.95667: variable '__network_packages_default_initscripts' from source: role '' defaults 7557 1726882097.95713: variable '__network_packages_default_initscripts' from source: role '' defaults 7557 1726882097.95843: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 7557 1726882097.96198: variable 'network_connections' from source: task vars 7557 1726882097.96201: variable 'interface' from source: play vars 7557 1726882097.96205: variable 'interface' from source: play vars 7557 1726882097.96217: variable 'ansible_distribution' from source: facts 7557 1726882097.96225: variable '__network_rh_distros' from source: role '' defaults 7557 1726882097.96235: variable 'ansible_distribution_major_version' from source: facts 7557 1726882097.96252: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 7557 1726882097.96421: variable 'ansible_distribution' from source: facts 7557 1726882097.96433: variable '__network_rh_distros' from source: role '' defaults 7557 1726882097.96443: variable 'ansible_distribution_major_version' from source: facts 7557 1726882097.96461: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 7557 1726882097.96634: variable 'ansible_distribution' from source: facts 7557 1726882097.96643: variable '__network_rh_distros' from source: role '' defaults 7557 1726882097.96652: variable 'ansible_distribution_major_version' from source: facts 7557 1726882097.96687: variable 'network_provider' from source: set_fact 7557 1726882097.96711: variable 'ansible_facts' from source: unknown 7557 1726882097.97287: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 7557 1726882097.97291: when evaluation is False, skipping this task 7557 1726882097.97298: _execute() done 7557 1726882097.97300: dumping result to json 7557 1726882097.97302: done dumping result, returning 7557 1726882097.97308: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install packages [12673a56-9f93-ed48-b3a5-000000000071] 7557 1726882097.97313: sending task result for task 12673a56-9f93-ed48-b3a5-000000000071 skipping: [managed_node3] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 7557 1726882097.97459: no more pending results, returning what we have 7557 1726882097.97462: results queue empty 7557 1726882097.97463: checking for any_errors_fatal 7557 1726882097.97468: done checking for any_errors_fatal 7557 1726882097.97468: checking for max_fail_percentage 7557 1726882097.97470: done checking for max_fail_percentage 7557 1726882097.97471: checking to see if all hosts have failed and the running result is not ok 7557 1726882097.97472: done 
checking to see if all hosts have failed 7557 1726882097.97472: getting the remaining hosts for this loop 7557 1726882097.97474: done getting the remaining hosts for this loop 7557 1726882097.97477: getting the next task for host managed_node3 7557 1726882097.97484: done getting next task for host managed_node3 7557 1726882097.97487: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 7557 1726882097.97501: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7557 1726882097.97513: done sending task result for task 12673a56-9f93-ed48-b3a5-000000000071 7557 1726882097.97515: WORKER PROCESS EXITING 7557 1726882097.97528: getting variables 7557 1726882097.97529: in VariableManager get_vars() 7557 1726882097.97576: Calling all_inventory to load vars for managed_node3 7557 1726882097.97579: Calling groups_inventory to load vars for managed_node3 7557 1726882097.97581: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882097.97590: Calling all_plugins_play to load vars for managed_node3 7557 1726882097.97596: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882097.97599: Calling groups_plugins_play to load vars for managed_node3 7557 1726882097.98525: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882097.99378: done with get_vars() 7557 1726882097.99398: done getting variables 7557 1726882097.99441: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 21:28:17 -0400 (0:00:00.107) 0:00:23.847 ****** 7557 1726882097.99466: entering _queue_task() for managed_node3/package 7557 1726882097.99707: worker is 1 (out of 1 available) 7557 1726882097.99720: exiting _queue_task() for managed_node3/package 7557 1726882097.99731: done queuing things up, now waiting for results queue to drain 7557 1726882097.99733: waiting for pending results... 
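The "Install packages" skip above is driven by a package-facts check: the package action plugin is loaded for roles/network/tasks/main.yml:73 and the reported false_condition is not network_packages is subset(ansible_facts.packages.keys()), i.e. the task only runs when at least one package in network_packages is missing from the gathered package facts. A rough sketch under that reading; the name/state option values are assumptions, since the log does not show the task body.

    # Hypothetical reconstruction of the skipped task at roles/network/tasks/main.yml:73.
    # The package module and the when-condition are confirmed by the log;
    # the name/state options are assumptions.
    - name: Install packages
      package:
        name: "{{ network_packages }}"
        state: present
      when: not network_packages is subset(ansible_facts.packages.keys())

In this run every package in network_packages (assembled from the __network_packages_default_* role defaults resolved above) is already present in ansible_facts.packages, so the condition is False and nothing is installed.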
7557 1726882097.99908: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 7557 1726882098.00000: in run() - task 12673a56-9f93-ed48-b3a5-000000000072 7557 1726882098.00010: variable 'ansible_search_path' from source: unknown 7557 1726882098.00014: variable 'ansible_search_path' from source: unknown 7557 1726882098.00043: calling self._execute() 7557 1726882098.00125: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882098.00129: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882098.00139: variable 'omit' from source: magic vars 7557 1726882098.00410: variable 'ansible_distribution_major_version' from source: facts 7557 1726882098.00420: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882098.00502: variable 'network_state' from source: role '' defaults 7557 1726882098.00514: Evaluated conditional (network_state != {}): False 7557 1726882098.00517: when evaluation is False, skipping this task 7557 1726882098.00520: _execute() done 7557 1726882098.00522: dumping result to json 7557 1726882098.00525: done dumping result, returning 7557 1726882098.00530: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [12673a56-9f93-ed48-b3a5-000000000072] 7557 1726882098.00536: sending task result for task 12673a56-9f93-ed48-b3a5-000000000072 7557 1726882098.00622: done sending task result for task 12673a56-9f93-ed48-b3a5-000000000072 7557 1726882098.00625: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 7557 1726882098.00670: no more pending results, returning what we have 7557 1726882098.00674: results queue empty 7557 1726882098.00675: checking for any_errors_fatal 7557 1726882098.00683: done checking for any_errors_fatal 7557 1726882098.00684: checking for max_fail_percentage 7557 1726882098.00686: done checking for max_fail_percentage 7557 1726882098.00687: checking to see if all hosts have failed and the running result is not ok 7557 1726882098.00687: done checking to see if all hosts have failed 7557 1726882098.00688: getting the remaining hosts for this loop 7557 1726882098.00689: done getting the remaining hosts for this loop 7557 1726882098.00696: getting the next task for host managed_node3 7557 1726882098.00702: done getting next task for host managed_node3 7557 1726882098.00705: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 7557 1726882098.00708: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7557 1726882098.00726: getting variables 7557 1726882098.00727: in VariableManager get_vars() 7557 1726882098.00769: Calling all_inventory to load vars for managed_node3 7557 1726882098.00772: Calling groups_inventory to load vars for managed_node3 7557 1726882098.00774: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882098.00782: Calling all_plugins_play to load vars for managed_node3 7557 1726882098.00784: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882098.00786: Calling groups_plugins_play to load vars for managed_node3 7557 1726882098.01542: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882098.02411: done with get_vars() 7557 1726882098.02434: done getting variables 7557 1726882098.02477: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 21:28:18 -0400 (0:00:00.030) 0:00:23.878 ****** 7557 1726882098.02506: entering _queue_task() for managed_node3/package 7557 1726882098.02756: worker is 1 (out of 1 available) 7557 1726882098.02768: exiting _queue_task() for managed_node3/package 7557 1726882098.02780: done queuing things up, now waiting for results queue to drain 7557 1726882098.02782: waiting for pending results... 7557 1726882098.02971: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 7557 1726882098.03073: in run() - task 12673a56-9f93-ed48-b3a5-000000000073 7557 1726882098.03083: variable 'ansible_search_path' from source: unknown 7557 1726882098.03087: variable 'ansible_search_path' from source: unknown 7557 1726882098.03123: calling self._execute() 7557 1726882098.03198: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882098.03206: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882098.03214: variable 'omit' from source: magic vars 7557 1726882098.03503: variable 'ansible_distribution_major_version' from source: facts 7557 1726882098.03514: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882098.03601: variable 'network_state' from source: role '' defaults 7557 1726882098.03609: Evaluated conditional (network_state != {}): False 7557 1726882098.03612: when evaluation is False, skipping this task 7557 1726882098.03615: _execute() done 7557 1726882098.03617: dumping result to json 7557 1726882098.03620: done dumping result, returning 7557 1726882098.03629: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [12673a56-9f93-ed48-b3a5-000000000073] 7557 1726882098.03634: sending task result for task 12673a56-9f93-ed48-b3a5-000000000073 7557 1726882098.03722: done sending task result for task 12673a56-9f93-ed48-b3a5-000000000073 7557 1726882098.03726: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": 
"Conditional result was False" } 7557 1726882098.03771: no more pending results, returning what we have 7557 1726882098.03775: results queue empty 7557 1726882098.03776: checking for any_errors_fatal 7557 1726882098.03782: done checking for any_errors_fatal 7557 1726882098.03783: checking for max_fail_percentage 7557 1726882098.03785: done checking for max_fail_percentage 7557 1726882098.03786: checking to see if all hosts have failed and the running result is not ok 7557 1726882098.03786: done checking to see if all hosts have failed 7557 1726882098.03787: getting the remaining hosts for this loop 7557 1726882098.03789: done getting the remaining hosts for this loop 7557 1726882098.03792: getting the next task for host managed_node3 7557 1726882098.03800: done getting next task for host managed_node3 7557 1726882098.03804: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 7557 1726882098.03807: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7557 1726882098.03826: getting variables 7557 1726882098.03827: in VariableManager get_vars() 7557 1726882098.03880: Calling all_inventory to load vars for managed_node3 7557 1726882098.03882: Calling groups_inventory to load vars for managed_node3 7557 1726882098.03884: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882098.03896: Calling all_plugins_play to load vars for managed_node3 7557 1726882098.03899: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882098.03902: Calling groups_plugins_play to load vars for managed_node3 7557 1726882098.04759: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882098.05610: done with get_vars() 7557 1726882098.05627: done getting variables 7557 1726882098.05668: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 21:28:18 -0400 (0:00:00.031) 0:00:23.909 ****** 7557 1726882098.05696: entering _queue_task() for managed_node3/service 7557 1726882098.05914: worker is 1 (out of 1 available) 7557 1726882098.05928: exiting _queue_task() for managed_node3/service 7557 1726882098.05940: done queuing things up, now waiting for results queue to drain 7557 1726882098.05941: waiting for pending results... 
7557 1726882098.06116: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 7557 1726882098.06204: in run() - task 12673a56-9f93-ed48-b3a5-000000000074 7557 1726882098.06215: variable 'ansible_search_path' from source: unknown 7557 1726882098.06219: variable 'ansible_search_path' from source: unknown 7557 1726882098.06247: calling self._execute() 7557 1726882098.06333: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882098.06337: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882098.06346: variable 'omit' from source: magic vars 7557 1726882098.06619: variable 'ansible_distribution_major_version' from source: facts 7557 1726882098.06628: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882098.06709: variable '__network_wireless_connections_defined' from source: role '' defaults 7557 1726882098.06842: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 7557 1726882098.08321: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 7557 1726882098.08376: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 7557 1726882098.08408: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 7557 1726882098.08434: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 7557 1726882098.08456: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 7557 1726882098.08520: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7557 1726882098.08540: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7557 1726882098.08561: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7557 1726882098.08587: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7557 1726882098.08601: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7557 1726882098.08633: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7557 1726882098.08648: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7557 1726882098.08665: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, 
class_only=False) 7557 1726882098.08699: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7557 1726882098.08709: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7557 1726882098.08737: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7557 1726882098.08753: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7557 1726882098.08769: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7557 1726882098.08803: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7557 1726882098.08814: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7557 1726882098.08928: variable 'network_connections' from source: task vars 7557 1726882098.08939: variable 'interface' from source: play vars 7557 1726882098.08987: variable 'interface' from source: play vars 7557 1726882098.09042: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 7557 1726882098.09148: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 7557 1726882098.09183: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 7557 1726882098.09208: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 7557 1726882098.09234: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 7557 1726882098.09263: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 7557 1726882098.09278: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 7557 1726882098.09299: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 7557 1726882098.09317: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 7557 1726882098.09356: variable '__network_team_connections_defined' from source: role '' defaults 7557 1726882098.09506: variable 'network_connections' from source: task vars 7557 1726882098.09509: variable 'interface' from source: play vars 7557 1726882098.09557: variable 
'interface' from source: play vars 7557 1726882098.09573: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 7557 1726882098.09577: when evaluation is False, skipping this task 7557 1726882098.09579: _execute() done 7557 1726882098.09581: dumping result to json 7557 1726882098.09584: done dumping result, returning 7557 1726882098.09591: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [12673a56-9f93-ed48-b3a5-000000000074] 7557 1726882098.09598: sending task result for task 12673a56-9f93-ed48-b3a5-000000000074 7557 1726882098.09682: done sending task result for task 12673a56-9f93-ed48-b3a5-000000000074 7557 1726882098.09692: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 7557 1726882098.09741: no more pending results, returning what we have 7557 1726882098.09745: results queue empty 7557 1726882098.09746: checking for any_errors_fatal 7557 1726882098.09751: done checking for any_errors_fatal 7557 1726882098.09751: checking for max_fail_percentage 7557 1726882098.09753: done checking for max_fail_percentage 7557 1726882098.09754: checking to see if all hosts have failed and the running result is not ok 7557 1726882098.09754: done checking to see if all hosts have failed 7557 1726882098.09755: getting the remaining hosts for this loop 7557 1726882098.09756: done getting the remaining hosts for this loop 7557 1726882098.09760: getting the next task for host managed_node3 7557 1726882098.09766: done getting next task for host managed_node3 7557 1726882098.09769: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 7557 1726882098.09772: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7557 1726882098.09789: getting variables 7557 1726882098.09791: in VariableManager get_vars() 7557 1726882098.09843: Calling all_inventory to load vars for managed_node3 7557 1726882098.09845: Calling groups_inventory to load vars for managed_node3 7557 1726882098.09847: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882098.09856: Calling all_plugins_play to load vars for managed_node3 7557 1726882098.09859: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882098.09861: Calling groups_plugins_play to load vars for managed_node3 7557 1726882098.10656: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882098.11610: done with get_vars() 7557 1726882098.11627: done getting variables 7557 1726882098.11669: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 21:28:18 -0400 (0:00:00.059) 0:00:23.969 ****** 7557 1726882098.11691: entering _queue_task() for managed_node3/service 7557 1726882098.11919: worker is 1 (out of 1 available) 7557 1726882098.11932: exiting _queue_task() for managed_node3/service 7557 1726882098.11945: done queuing things up, now waiting for results queue to drain 7557 1726882098.11946: waiting for pending results... 7557 1726882098.12116: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 7557 1726882098.12204: in run() - task 12673a56-9f93-ed48-b3a5-000000000075 7557 1726882098.12216: variable 'ansible_search_path' from source: unknown 7557 1726882098.12220: variable 'ansible_search_path' from source: unknown 7557 1726882098.12246: calling self._execute() 7557 1726882098.12327: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882098.12331: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882098.12340: variable 'omit' from source: magic vars 7557 1726882098.12600: variable 'ansible_distribution_major_version' from source: facts 7557 1726882098.12613: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882098.12721: variable 'network_provider' from source: set_fact 7557 1726882098.12727: variable 'network_state' from source: role '' defaults 7557 1726882098.12731: Evaluated conditional (network_provider == "nm" or network_state != {}): True 7557 1726882098.12737: variable 'omit' from source: magic vars 7557 1726882098.12775: variable 'omit' from source: magic vars 7557 1726882098.12799: variable 'network_service_name' from source: role '' defaults 7557 1726882098.12847: variable 'network_service_name' from source: role '' defaults 7557 1726882098.12920: variable '__network_provider_setup' from source: role '' defaults 7557 1726882098.12923: variable '__network_service_name_default_nm' from source: role '' defaults 7557 1726882098.12970: variable '__network_service_name_default_nm' from source: role '' defaults 7557 1726882098.12978: variable '__network_packages_default_nm' from source: role '' defaults 7557 1726882098.13025: variable 
'__network_packages_default_nm' from source: role '' defaults 7557 1726882098.13169: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 7557 1726882098.14583: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 7557 1726882098.14637: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 7557 1726882098.14665: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 7557 1726882098.14692: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 7557 1726882098.14714: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 7557 1726882098.14770: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7557 1726882098.14797: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7557 1726882098.14814: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7557 1726882098.14841: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7557 1726882098.14852: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7557 1726882098.14882: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7557 1726882098.14903: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7557 1726882098.14922: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7557 1726882098.14946: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7557 1726882098.14956: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7557 1726882098.15105: variable '__network_packages_default_gobject_packages' from source: role '' defaults 7557 1726882098.15180: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7557 1726882098.15199: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7557 1726882098.15217: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7557 1726882098.15245: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7557 1726882098.15255: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7557 1726882098.15319: variable 'ansible_python' from source: facts 7557 1726882098.15339: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 7557 1726882098.15396: variable '__network_wpa_supplicant_required' from source: role '' defaults 7557 1726882098.15449: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 7557 1726882098.15530: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7557 1726882098.15553: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7557 1726882098.15568: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7557 1726882098.15592: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7557 1726882098.15605: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7557 1726882098.15638: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7557 1726882098.15660: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7557 1726882098.15677: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7557 1726882098.15705: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7557 1726882098.15716: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7557 1726882098.15807: variable 'network_connections' from source: task vars 7557 1726882098.15814: variable 'interface' from source: play vars 7557 
1726882098.15864: variable 'interface' from source: play vars 7557 1726882098.15938: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 7557 1726882098.16068: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 7557 1726882098.16108: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 7557 1726882098.16138: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 7557 1726882098.16167: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 7557 1726882098.16214: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 7557 1726882098.16235: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 7557 1726882098.16257: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 7557 1726882098.16279: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 7557 1726882098.16316: variable '__network_wireless_connections_defined' from source: role '' defaults 7557 1726882098.16488: variable 'network_connections' from source: task vars 7557 1726882098.16496: variable 'interface' from source: play vars 7557 1726882098.16548: variable 'interface' from source: play vars 7557 1726882098.16572: variable '__network_packages_default_wireless' from source: role '' defaults 7557 1726882098.16626: variable '__network_wireless_connections_defined' from source: role '' defaults 7557 1726882098.16807: variable 'network_connections' from source: task vars 7557 1726882098.16810: variable 'interface' from source: play vars 7557 1726882098.16859: variable 'interface' from source: play vars 7557 1726882098.16879: variable '__network_packages_default_team' from source: role '' defaults 7557 1726882098.16933: variable '__network_team_connections_defined' from source: role '' defaults 7557 1726882098.17114: variable 'network_connections' from source: task vars 7557 1726882098.17117: variable 'interface' from source: play vars 7557 1726882098.17166: variable 'interface' from source: play vars 7557 1726882098.17205: variable '__network_service_name_default_initscripts' from source: role '' defaults 7557 1726882098.17248: variable '__network_service_name_default_initscripts' from source: role '' defaults 7557 1726882098.17253: variable '__network_packages_default_initscripts' from source: role '' defaults 7557 1726882098.17298: variable '__network_packages_default_initscripts' from source: role '' defaults 7557 1726882098.17430: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 7557 1726882098.17733: variable 'network_connections' from source: task vars 7557 1726882098.17738: variable 'interface' from source: play vars 7557 1726882098.17780: variable 'interface' from source: play vars 7557 1726882098.17787: variable 'ansible_distribution' from source: facts 7557 1726882098.17789: variable '__network_rh_distros' from 
source: role '' defaults 7557 1726882098.17798: variable 'ansible_distribution_major_version' from source: facts 7557 1726882098.17808: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 7557 1726882098.17920: variable 'ansible_distribution' from source: facts 7557 1726882098.17923: variable '__network_rh_distros' from source: role '' defaults 7557 1726882098.17928: variable 'ansible_distribution_major_version' from source: facts 7557 1726882098.17939: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 7557 1726882098.18051: variable 'ansible_distribution' from source: facts 7557 1726882098.18054: variable '__network_rh_distros' from source: role '' defaults 7557 1726882098.18059: variable 'ansible_distribution_major_version' from source: facts 7557 1726882098.18086: variable 'network_provider' from source: set_fact 7557 1726882098.18107: variable 'omit' from source: magic vars 7557 1726882098.18128: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7557 1726882098.18148: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7557 1726882098.18162: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7557 1726882098.18176: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7557 1726882098.18187: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7557 1726882098.18210: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7557 1726882098.18213: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882098.18215: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882098.18281: Set connection var ansible_module_compression to ZIP_DEFLATED 7557 1726882098.18297: Set connection var ansible_shell_executable to /bin/sh 7557 1726882098.18300: Set connection var ansible_shell_type to sh 7557 1726882098.18303: Set connection var ansible_pipelining to False 7557 1726882098.18305: Set connection var ansible_connection to ssh 7557 1726882098.18307: Set connection var ansible_timeout to 10 7557 1726882098.18324: variable 'ansible_shell_executable' from source: unknown 7557 1726882098.18327: variable 'ansible_connection' from source: unknown 7557 1726882098.18330: variable 'ansible_module_compression' from source: unknown 7557 1726882098.18332: variable 'ansible_shell_type' from source: unknown 7557 1726882098.18334: variable 'ansible_shell_executable' from source: unknown 7557 1726882098.18336: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882098.18340: variable 'ansible_pipelining' from source: unknown 7557 1726882098.18343: variable 'ansible_timeout' from source: unknown 7557 1726882098.18347: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882098.18424: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 7557 1726882098.18430: variable 'omit' from source: magic vars 7557 1726882098.18437: starting attempt loop 7557 
1726882098.18440: running the handler 7557 1726882098.18491: variable 'ansible_facts' from source: unknown 7557 1726882098.18947: _low_level_execute_command(): starting 7557 1726882098.18953: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7557 1726882098.19460: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882098.19463: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882098.19466: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 7557 1726882098.19469: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found <<< 7557 1726882098.19471: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882098.19526: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882098.19529: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882098.19535: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882098.19588: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882098.21258: stdout chunk (state=3): >>>/root <<< 7557 1726882098.21352: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882098.21383: stderr chunk (state=3): >>><<< 7557 1726882098.21387: stdout chunk (state=3): >>><<< 7557 1726882098.21409: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7557 1726882098.21422: _low_level_execute_command(): starting 7557 1726882098.21426: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && 
mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882098.2141054-8515-159900358242098 `" && echo ansible-tmp-1726882098.2141054-8515-159900358242098="` echo /root/.ansible/tmp/ansible-tmp-1726882098.2141054-8515-159900358242098 `" ) && sleep 0' 7557 1726882098.21870: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7557 1726882098.21873: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882098.21876: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration <<< 7557 1726882098.21878: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7557 1726882098.21880: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882098.21932: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882098.21936: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882098.21938: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882098.21985: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882098.23841: stdout chunk (state=3): >>>ansible-tmp-1726882098.2141054-8515-159900358242098=/root/.ansible/tmp/ansible-tmp-1726882098.2141054-8515-159900358242098 <<< 7557 1726882098.23946: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882098.23969: stderr chunk (state=3): >>><<< 7557 1726882098.23972: stdout chunk (state=3): >>><<< 7557 1726882098.23985: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882098.2141054-8515-159900358242098=/root/.ansible/tmp/ansible-tmp-1726882098.2141054-8515-159900358242098 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 
debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7557 1726882098.24015: variable 'ansible_module_compression' from source: unknown 7557 1726882098.24056: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-7557ap94rh2e/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 7557 1726882098.24111: variable 'ansible_facts' from source: unknown 7557 1726882098.24245: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882098.2141054-8515-159900358242098/AnsiballZ_systemd.py 7557 1726882098.24347: Sending initial data 7557 1726882098.24351: Sent initial data (154 bytes) 7557 1726882098.24801: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882098.24804: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found <<< 7557 1726882098.24811: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882098.24814: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882098.24818: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882098.24863: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882098.24866: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882098.24868: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882098.24918: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882098.26465: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 7557 1726882098.26509: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 7557 1726882098.26561: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-7557ap94rh2e/tmpiv4_pu7x /root/.ansible/tmp/ansible-tmp-1726882098.2141054-8515-159900358242098/AnsiballZ_systemd.py <<< 7557 1726882098.26565: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882098.2141054-8515-159900358242098/AnsiballZ_systemd.py" <<< 7557 1726882098.26606: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-7557ap94rh2e/tmpiv4_pu7x" to remote "/root/.ansible/tmp/ansible-tmp-1726882098.2141054-8515-159900358242098/AnsiballZ_systemd.py" <<< 7557 1726882098.26610: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882098.2141054-8515-159900358242098/AnsiballZ_systemd.py" <<< 7557 1726882098.27666: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882098.27712: stderr chunk (state=3): >>><<< 7557 1726882098.27715: stdout chunk (state=3): >>><<< 7557 1726882098.27749: done transferring module to remote 7557 1726882098.27758: _low_level_execute_command(): starting 7557 1726882098.27763: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882098.2141054-8515-159900358242098/ /root/.ansible/tmp/ansible-tmp-1726882098.2141054-8515-159900358242098/AnsiballZ_systemd.py && sleep 0' 7557 1726882098.28201: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882098.28205: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882098.28215: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found <<< 7557 1726882098.28227: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882098.28277: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882098.28283: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882098.28286: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882098.28326: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882098.30027: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882098.30048: stderr chunk (state=3): >>><<< 7557 1726882098.30051: stdout chunk (state=3): >>><<< 7557 1726882098.30064: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7557 1726882098.30066: _low_level_execute_command(): starting 7557 1726882098.30071: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882098.2141054-8515-159900358242098/AnsiballZ_systemd.py && sleep 0' 7557 1726882098.30511: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7557 1726882098.30515: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found <<< 7557 1726882098.30517: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882098.30520: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 <<< 7557 1726882098.30522: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882098.30560: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882098.30578: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882098.30632: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882098.59432: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "711", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": 
"0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:22:06 EDT", "ExecMainStartTimestampMonotonic": "33869352", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 21:22:06 EDT", "ExecMainHandoffTimestampMonotonic": "33887880", "ExecMainPID": "711", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[Fri 2024-09-20 21:22:06 EDT] ; stop_time=[n/a] ; pid=711 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[Fri 2024-09-20 21:22:06 EDT] ; stop_time=[n/a] ; pid=711 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2938", "MemoryCurrent": "9498624", "MemoryPeak": "10027008", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3324878848", "EffectiveMemoryMax": "3702878208", "EffectiveMemoryHigh": "3702878208", "CPUUsageNSec": "120087000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", <<< 7557 1726882098.59455: stdout chunk (state=3): >>>"MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", 
"LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", 
"Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket sysinit.target system.slice", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "cloud-init.service shutdown.target NetworkManager-wait-online.service n<<< 7557 1726882098.59470: stdout chunk (state=3): >>>etwork.target multi-user.target", "After": "system.slice systemd-journald.socket dbus.socket sysinit.target network-pre.target cloud-init-local.service basic.target dbus-broker.service", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:22:07 EDT", "StateChangeTimestampMonotonic": "34618487", "InactiveExitTimestamp": "Fri 2024-09-20 21:22:06 EDT", "InactiveExitTimestampMonotonic": "33869684", "ActiveEnterTimestamp": "Fri 2024-09-20 21:22:07 EDT", "ActiveEnterTimestampMonotonic": "34618487", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:22:06 EDT", "ConditionTimestampMonotonic": "33868497", "AssertTimestamp": "Fri 2024-09-20 21:22:06 EDT", "AssertTimestampMonotonic": "33868500", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "4af90bef47894571b134cea51cfa03e2", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 7557 1726882098.61208: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. 
<<< 7557 1726882098.61237: stderr chunk (state=3): >>><<< 7557 1726882098.61240: stdout chunk (state=3): >>><<< 7557 1726882098.61256: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "711", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:22:06 EDT", "ExecMainStartTimestampMonotonic": "33869352", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 21:22:06 EDT", "ExecMainHandoffTimestampMonotonic": "33887880", "ExecMainPID": "711", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[Fri 2024-09-20 21:22:06 EDT] ; stop_time=[n/a] ; pid=711 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[Fri 2024-09-20 21:22:06 EDT] ; stop_time=[n/a] ; pid=711 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2938", "MemoryCurrent": "9498624", "MemoryPeak": "10027008", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3324878848", "EffectiveMemoryMax": "3702878208", "EffectiveMemoryHigh": "3702878208", "CPUUsageNSec": "120087000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", 
"MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket sysinit.target system.slice", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "cloud-init.service shutdown.target NetworkManager-wait-online.service network.target multi-user.target", "After": "system.slice systemd-journald.socket dbus.socket sysinit.target network-pre.target cloud-init-local.service basic.target dbus-broker.service", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:22:07 EDT", "StateChangeTimestampMonotonic": "34618487", "InactiveExitTimestamp": "Fri 2024-09-20 21:22:06 EDT", "InactiveExitTimestampMonotonic": "33869684", "ActiveEnterTimestamp": "Fri 2024-09-20 21:22:07 EDT", "ActiveEnterTimestampMonotonic": "34618487", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:22:06 EDT", "ConditionTimestampMonotonic": "33868497", "AssertTimestamp": "Fri 2024-09-20 21:22:06 EDT", "AssertTimestampMonotonic": "33868500", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "4af90bef47894571b134cea51cfa03e2", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration 
data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. 7557 1726882098.61380: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882098.2141054-8515-159900358242098/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7557 1726882098.61394: _low_level_execute_command(): starting 7557 1726882098.61403: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882098.2141054-8515-159900358242098/ > /dev/null 2>&1 && sleep 0' 7557 1726882098.61849: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7557 1726882098.61853: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882098.61855: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 7557 1726882098.61857: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found <<< 7557 1726882098.61859: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882098.61900: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882098.61912: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK 
<<< 7557 1726882098.61921: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882098.61961: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882098.63718: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882098.63742: stderr chunk (state=3): >>><<< 7557 1726882098.63745: stdout chunk (state=3): >>><<< 7557 1726882098.63758: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7557 1726882098.63764: handler run complete 7557 1726882098.63812: attempt loop complete, returning result 7557 1726882098.63815: _execute() done 7557 1726882098.63817: dumping result to json 7557 1726882098.63829: done dumping result, returning 7557 1726882098.63837: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [12673a56-9f93-ed48-b3a5-000000000075] 7557 1726882098.63840: sending task result for task 12673a56-9f93-ed48-b3a5-000000000075 7557 1726882098.64066: done sending task result for task 12673a56-9f93-ed48-b3a5-000000000075 7557 1726882098.64069: WORKER PROCESS EXITING ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 7557 1726882098.64124: no more pending results, returning what we have 7557 1726882098.64127: results queue empty 7557 1726882098.64128: checking for any_errors_fatal 7557 1726882098.64134: done checking for any_errors_fatal 7557 1726882098.64134: checking for max_fail_percentage 7557 1726882098.64136: done checking for max_fail_percentage 7557 1726882098.64137: checking to see if all hosts have failed and the running result is not ok 7557 1726882098.64138: done checking to see if all hosts have failed 7557 1726882098.64139: getting the remaining hosts for this loop 7557 1726882098.64140: done getting the remaining hosts for this loop 7557 1726882098.64143: getting the next task for host managed_node3 7557 1726882098.64148: done getting next task for host managed_node3 7557 1726882098.64151: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 7557 1726882098.64154: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, 
pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7557 1726882098.64163: getting variables 7557 1726882098.64165: in VariableManager get_vars() 7557 1726882098.64211: Calling all_inventory to load vars for managed_node3 7557 1726882098.64213: Calling groups_inventory to load vars for managed_node3 7557 1726882098.64215: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882098.64225: Calling all_plugins_play to load vars for managed_node3 7557 1726882098.64227: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882098.64229: Calling groups_plugins_play to load vars for managed_node3 7557 1726882098.64996: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882098.65861: done with get_vars() 7557 1726882098.65878: done getting variables 7557 1726882098.65924: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 21:28:18 -0400 (0:00:00.542) 0:00:24.512 ****** 7557 1726882098.65948: entering _queue_task() for managed_node3/service 7557 1726882098.66182: worker is 1 (out of 1 available) 7557 1726882098.66198: exiting _queue_task() for managed_node3/service 7557 1726882098.66211: done queuing things up, now waiting for results queue to drain 7557 1726882098.66212: waiting for pending results... 
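For orientation: the censored "Enable and start NetworkManager" result above was produced by the service action plugin dispatching ansible.legacy.systemd with module_args name=NetworkManager, state=started, enabled=true, and the next two tasks ("Enable and start wpa_supplicant", "Enable network service") are gated on the conditionals quoted in the log (network_provider == "nm", __network_wpa_supplicant_required, network_provider == "initscripts"). A minimal, hedged YAML sketch of tasks with that shape follows; it is not the role's actual tasks/main.yml, and anything not quoted from the log (the exact module choice, the wpa_supplicant and network unit names, the precise when clauses) is an assumption.

    # Hedged sketch only: approximates the service-management tasks implied by this log.
    # The real definitions live in fedora.linux_system_roles.network and may differ.
    - name: Enable and start NetworkManager
      ansible.builtin.systemd:          # the log shows ansible.legacy.systemd being dispatched
        name: NetworkManager
        state: started
        enabled: true
      no_log: true                      # matches the "censored" result printed above
      when: network_provider == "nm"    # assumed gate; the log resolves network_provider from set_fact

    - name: Enable and start wpa_supplicant
      ansible.builtin.systemd:
        name: wpa_supplicant            # assumed unit name, not shown in the log
        state: started
        enabled: true
      when: __network_wpa_supplicant_required | bool    # evaluated False below, so the task is skipped

    - name: Enable network service
      ansible.builtin.service:          # assumed module; only the 'service' action plugin is visible in the log
        name: network                   # assumed unit name
        enabled: true
      when: network_provider == "initscripts"           # evaluated False below, so the task is skipped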
7557 1726882098.66383: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 7557 1726882098.66477: in run() - task 12673a56-9f93-ed48-b3a5-000000000076 7557 1726882098.66488: variable 'ansible_search_path' from source: unknown 7557 1726882098.66492: variable 'ansible_search_path' from source: unknown 7557 1726882098.66526: calling self._execute() 7557 1726882098.66609: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882098.66613: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882098.66621: variable 'omit' from source: magic vars 7557 1726882098.66892: variable 'ansible_distribution_major_version' from source: facts 7557 1726882098.66906: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882098.66988: variable 'network_provider' from source: set_fact 7557 1726882098.66994: Evaluated conditional (network_provider == "nm"): True 7557 1726882098.67056: variable '__network_wpa_supplicant_required' from source: role '' defaults 7557 1726882098.67121: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 7557 1726882098.67237: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 7557 1726882098.68882: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 7557 1726882098.68930: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 7557 1726882098.68959: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 7557 1726882098.68984: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 7557 1726882098.69008: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 7557 1726882098.69067: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7557 1726882098.69087: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7557 1726882098.69109: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7557 1726882098.69134: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7557 1726882098.69144: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7557 1726882098.69225: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7557 1726882098.69228: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, 
class_only=False) 7557 1726882098.69235: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7557 1726882098.69262: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7557 1726882098.69274: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7557 1726882098.69307: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7557 1726882098.69323: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7557 1726882098.69339: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7557 1726882098.69362: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7557 1726882098.69374: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7557 1726882098.69471: variable 'network_connections' from source: task vars 7557 1726882098.69482: variable 'interface' from source: play vars 7557 1726882098.69538: variable 'interface' from source: play vars 7557 1726882098.69587: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 7557 1726882098.69713: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 7557 1726882098.69739: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 7557 1726882098.69760: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 7557 1726882098.69781: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 7557 1726882098.69819: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 7557 1726882098.69834: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 7557 1726882098.69850: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 7557 1726882098.69867: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 7557 1726882098.69907: variable 
'__network_wireless_connections_defined' from source: role '' defaults 7557 1726882098.70063: variable 'network_connections' from source: task vars 7557 1726882098.70067: variable 'interface' from source: play vars 7557 1726882098.70113: variable 'interface' from source: play vars 7557 1726882098.70134: Evaluated conditional (__network_wpa_supplicant_required): False 7557 1726882098.70139: when evaluation is False, skipping this task 7557 1726882098.70142: _execute() done 7557 1726882098.70146: dumping result to json 7557 1726882098.70148: done dumping result, returning 7557 1726882098.70159: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [12673a56-9f93-ed48-b3a5-000000000076] 7557 1726882098.70170: sending task result for task 12673a56-9f93-ed48-b3a5-000000000076 7557 1726882098.70245: done sending task result for task 12673a56-9f93-ed48-b3a5-000000000076 7557 1726882098.70248: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 7557 1726882098.70301: no more pending results, returning what we have 7557 1726882098.70305: results queue empty 7557 1726882098.70306: checking for any_errors_fatal 7557 1726882098.70324: done checking for any_errors_fatal 7557 1726882098.70325: checking for max_fail_percentage 7557 1726882098.70326: done checking for max_fail_percentage 7557 1726882098.70327: checking to see if all hosts have failed and the running result is not ok 7557 1726882098.70328: done checking to see if all hosts have failed 7557 1726882098.70328: getting the remaining hosts for this loop 7557 1726882098.70330: done getting the remaining hosts for this loop 7557 1726882098.70333: getting the next task for host managed_node3 7557 1726882098.70340: done getting next task for host managed_node3 7557 1726882098.70344: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 7557 1726882098.70347: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7557 1726882098.70364: getting variables 7557 1726882098.70365: in VariableManager get_vars() 7557 1726882098.70414: Calling all_inventory to load vars for managed_node3 7557 1726882098.70417: Calling groups_inventory to load vars for managed_node3 7557 1726882098.70420: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882098.70430: Calling all_plugins_play to load vars for managed_node3 7557 1726882098.70432: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882098.70435: Calling groups_plugins_play to load vars for managed_node3 7557 1726882098.71554: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882098.72768: done with get_vars() 7557 1726882098.72786: done getting variables 7557 1726882098.72834: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 21:28:18 -0400 (0:00:00.069) 0:00:24.581 ****** 7557 1726882098.72858: entering _queue_task() for managed_node3/service 7557 1726882098.73100: worker is 1 (out of 1 available) 7557 1726882098.73115: exiting _queue_task() for managed_node3/service 7557 1726882098.73127: done queuing things up, now waiting for results queue to drain 7557 1726882098.73128: waiting for pending results... 7557 1726882098.73309: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable network service 7557 1726882098.73407: in run() - task 12673a56-9f93-ed48-b3a5-000000000077 7557 1726882098.73418: variable 'ansible_search_path' from source: unknown 7557 1726882098.73421: variable 'ansible_search_path' from source: unknown 7557 1726882098.73449: calling self._execute() 7557 1726882098.73532: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882098.73536: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882098.73545: variable 'omit' from source: magic vars 7557 1726882098.73829: variable 'ansible_distribution_major_version' from source: facts 7557 1726882098.73839: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882098.73923: variable 'network_provider' from source: set_fact 7557 1726882098.73929: Evaluated conditional (network_provider == "initscripts"): False 7557 1726882098.73931: when evaluation is False, skipping this task 7557 1726882098.73934: _execute() done 7557 1726882098.73937: dumping result to json 7557 1726882098.73939: done dumping result, returning 7557 1726882098.73946: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable network service [12673a56-9f93-ed48-b3a5-000000000077] 7557 1726882098.73952: sending task result for task 12673a56-9f93-ed48-b3a5-000000000077 7557 1726882098.74042: done sending task result for task 12673a56-9f93-ed48-b3a5-000000000077 7557 1726882098.74045: WORKER PROCESS EXITING skipping: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 7557 1726882098.74105: no more pending results, 
returning what we have 7557 1726882098.74109: results queue empty 7557 1726882098.74110: checking for any_errors_fatal 7557 1726882098.74119: done checking for any_errors_fatal 7557 1726882098.74119: checking for max_fail_percentage 7557 1726882098.74121: done checking for max_fail_percentage 7557 1726882098.74122: checking to see if all hosts have failed and the running result is not ok 7557 1726882098.74123: done checking to see if all hosts have failed 7557 1726882098.74123: getting the remaining hosts for this loop 7557 1726882098.74125: done getting the remaining hosts for this loop 7557 1726882098.74128: getting the next task for host managed_node3 7557 1726882098.74134: done getting next task for host managed_node3 7557 1726882098.74138: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 7557 1726882098.74142: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7557 1726882098.74162: getting variables 7557 1726882098.74163: in VariableManager get_vars() 7557 1726882098.74224: Calling all_inventory to load vars for managed_node3 7557 1726882098.74227: Calling groups_inventory to load vars for managed_node3 7557 1726882098.74229: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882098.74240: Calling all_plugins_play to load vars for managed_node3 7557 1726882098.74242: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882098.74244: Calling groups_plugins_play to load vars for managed_node3 7557 1726882098.75955: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882098.77701: done with get_vars() 7557 1726882098.77727: done getting variables 7557 1726882098.77787: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 21:28:18 -0400 (0:00:00.049) 0:00:24.631 ****** 7557 1726882098.77828: entering _queue_task() for managed_node3/copy 7557 1726882098.78186: worker is 1 (out of 1 available) 7557 1726882098.78203: exiting _queue_task() for managed_node3/copy 7557 1726882098.78215: done queuing things up, now waiting for results queue to drain 7557 1726882098.78217: waiting for pending results... 
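[editor's note] The three tasks traced above ("Enable and start wpa_supplicant", "Enable network service", "Ensure initscripts network file dependency is present") are all skipped because their when: conditions evaluate to False on this host: network_provider is "nm", __network_wpa_supplicant_required is false, and the initscripts-only tasks never apply. The role's real task file is at roles/network/tasks/main.yml; the snippet below is only a hedged sketch of how tasks skipped on a false condition are typically written, with the conditions mirroring the ones evaluated in the log, not the role's actual source.

    # Hypothetical sketch (not the fedora.linux_system_roles.network source).
    # Conditions mirror the evaluations logged above.
    - name: Enable and start wpa_supplicant
      ansible.builtin.service:
        name: wpa_supplicant
        state: started
        enabled: true
      when:
        - network_provider == "nm"                     # True in this run
        - __network_wpa_supplicant_required | bool     # False, so the task is skipped

    - name: Enable network service
      ansible.builtin.service:
        name: network
        enabled: true
      when: network_provider == "initscripts"          # False: provider is "nm"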
7557 1726882098.78617: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 7557 1726882098.78688: in run() - task 12673a56-9f93-ed48-b3a5-000000000078 7557 1726882098.78716: variable 'ansible_search_path' from source: unknown 7557 1726882098.78800: variable 'ansible_search_path' from source: unknown 7557 1726882098.78805: calling self._execute() 7557 1726882098.78888: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882098.78905: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882098.78920: variable 'omit' from source: magic vars 7557 1726882098.79334: variable 'ansible_distribution_major_version' from source: facts 7557 1726882098.79352: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882098.79479: variable 'network_provider' from source: set_fact 7557 1726882098.79498: Evaluated conditional (network_provider == "initscripts"): False 7557 1726882098.79511: when evaluation is False, skipping this task 7557 1726882098.79519: _execute() done 7557 1726882098.79526: dumping result to json 7557 1726882098.79614: done dumping result, returning 7557 1726882098.79618: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [12673a56-9f93-ed48-b3a5-000000000078] 7557 1726882098.79622: sending task result for task 12673a56-9f93-ed48-b3a5-000000000078 7557 1726882098.79701: done sending task result for task 12673a56-9f93-ed48-b3a5-000000000078 7557 1726882098.79704: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 7557 1726882098.79764: no more pending results, returning what we have 7557 1726882098.79769: results queue empty 7557 1726882098.79771: checking for any_errors_fatal 7557 1726882098.79776: done checking for any_errors_fatal 7557 1726882098.79777: checking for max_fail_percentage 7557 1726882098.79779: done checking for max_fail_percentage 7557 1726882098.79780: checking to see if all hosts have failed and the running result is not ok 7557 1726882098.79781: done checking to see if all hosts have failed 7557 1726882098.79782: getting the remaining hosts for this loop 7557 1726882098.79783: done getting the remaining hosts for this loop 7557 1726882098.79787: getting the next task for host managed_node3 7557 1726882098.79798: done getting next task for host managed_node3 7557 1726882098.79802: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 7557 1726882098.79807: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7557 1726882098.79828: getting variables 7557 1726882098.79830: in VariableManager get_vars() 7557 1726882098.79884: Calling all_inventory to load vars for managed_node3 7557 1726882098.79887: Calling groups_inventory to load vars for managed_node3 7557 1726882098.79890: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882098.80007: Calling all_plugins_play to load vars for managed_node3 7557 1726882098.80012: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882098.80015: Calling groups_plugins_play to load vars for managed_node3 7557 1726882098.81515: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882098.83041: done with get_vars() 7557 1726882098.83068: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 21:28:18 -0400 (0:00:00.053) 0:00:24.684 ****** 7557 1726882098.83159: entering _queue_task() for managed_node3/fedora.linux_system_roles.network_connections 7557 1726882098.83490: worker is 1 (out of 1 available) 7557 1726882098.83507: exiting _queue_task() for managed_node3/fedora.linux_system_roles.network_connections 7557 1726882098.83521: done queuing things up, now waiting for results queue to drain 7557 1726882098.83522: waiting for pending results... 7557 1726882098.83814: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 7557 1726882098.83951: in run() - task 12673a56-9f93-ed48-b3a5-000000000079 7557 1726882098.83971: variable 'ansible_search_path' from source: unknown 7557 1726882098.83979: variable 'ansible_search_path' from source: unknown 7557 1726882098.84100: calling self._execute() 7557 1726882098.84145: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882098.84158: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882098.84173: variable 'omit' from source: magic vars 7557 1726882098.84531: variable 'ansible_distribution_major_version' from source: facts 7557 1726882098.84548: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882098.84561: variable 'omit' from source: magic vars 7557 1726882098.84621: variable 'omit' from source: magic vars 7557 1726882098.84786: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 7557 1726882098.86977: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 7557 1726882098.87055: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 7557 1726882098.87100: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 7557 1726882098.87157: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 7557 1726882098.87171: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 7557 1726882098.87258: variable 'network_provider' from source: set_fact 7557 1726882098.87408: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 7557 1726882098.87500: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7557 1726882098.87503: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7557 1726882098.87539: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7557 1726882098.87559: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7557 1726882098.87644: variable 'omit' from source: magic vars 7557 1726882098.87763: variable 'omit' from source: magic vars 7557 1726882098.87864: variable 'network_connections' from source: task vars 7557 1726882098.87879: variable 'interface' from source: play vars 7557 1726882098.87947: variable 'interface' from source: play vars 7557 1726882098.88300: variable 'omit' from source: magic vars 7557 1726882098.88304: variable '__lsr_ansible_managed' from source: task vars 7557 1726882098.88306: variable '__lsr_ansible_managed' from source: task vars 7557 1726882098.88826: Loaded config def from plugin (lookup/template) 7557 1726882098.88837: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 7557 1726882098.88875: File lookup term: get_ansible_managed.j2 7557 1726882098.88883: variable 'ansible_search_path' from source: unknown 7557 1726882098.88897: evaluation_path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 7557 1726882098.88916: search_path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 7557 1726882098.88940: variable 'ansible_search_path' from source: unknown 7557 1726882098.94862: variable 'ansible_managed' from source: unknown 7557 1726882098.95008: variable 'omit' from source: magic vars 7557 1726882098.95046: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7557 1726882098.95078: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 
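[editor's note] At this point the lookup/template plugin renders get_ansible_managed.j2 (the file searched for along the paths listed above) into the "Ansible managed" header string that is later passed to the network_connections module as __header. A minimal sketch of that pattern follows; the template file name comes from the log, but the set_fact shape is an assumption, not the role's literal task.

    # Hedged sketch: render an "Ansible managed" header from a role template.
    - name: Build the ansible_managed header for generated profiles
      ansible.builtin.set_fact:
        __header: "{{ lookup('template', 'get_ansible_managed.j2') }}"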
7557 1726882098.95106: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7557 1726882098.95126: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7557 1726882098.95144: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7557 1726882098.95174: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7557 1726882098.95182: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882098.95189: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882098.95300: Set connection var ansible_module_compression to ZIP_DEFLATED 7557 1726882098.95313: Set connection var ansible_shell_executable to /bin/sh 7557 1726882098.95320: Set connection var ansible_shell_type to sh 7557 1726882098.95328: Set connection var ansible_pipelining to False 7557 1726882098.95334: Set connection var ansible_connection to ssh 7557 1726882098.95343: Set connection var ansible_timeout to 10 7557 1726882098.95374: variable 'ansible_shell_executable' from source: unknown 7557 1726882098.95381: variable 'ansible_connection' from source: unknown 7557 1726882098.95467: variable 'ansible_module_compression' from source: unknown 7557 1726882098.95470: variable 'ansible_shell_type' from source: unknown 7557 1726882098.95473: variable 'ansible_shell_executable' from source: unknown 7557 1726882098.95475: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882098.95477: variable 'ansible_pipelining' from source: unknown 7557 1726882098.95479: variable 'ansible_timeout' from source: unknown 7557 1726882098.95481: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882098.95561: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 7557 1726882098.95586: variable 'omit' from source: magic vars 7557 1726882098.95604: starting attempt loop 7557 1726882098.95612: running the handler 7557 1726882098.95630: _low_level_execute_command(): starting 7557 1726882098.95641: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7557 1726882098.96365: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882098.96385: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 
1726882098.96403: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882098.96471: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882098.98141: stdout chunk (state=3): >>>/root <<< 7557 1726882098.98279: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882098.98397: stdout chunk (state=3): >>><<< 7557 1726882098.98401: stderr chunk (state=3): >>><<< 7557 1726882098.98404: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7557 1726882098.98406: _low_level_execute_command(): starting 7557 1726882098.98409: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882098.9832883-8535-27802346759676 `" && echo ansible-tmp-1726882098.9832883-8535-27802346759676="` echo /root/.ansible/tmp/ansible-tmp-1726882098.9832883-8535-27802346759676 `" ) && sleep 0' 7557 1726882098.98980: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7557 1726882098.98999: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7557 1726882098.99015: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882098.99034: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7557 1726882098.99053: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 <<< 7557 1726882098.99066: stderr chunk (state=3): >>>debug2: match not found <<< 7557 1726882098.99081: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882098.99109: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7557 1726882098.99198: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882098.99223: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882098.99304: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882099.01189: stdout chunk (state=3): >>>ansible-tmp-1726882098.9832883-8535-27802346759676=/root/.ansible/tmp/ansible-tmp-1726882098.9832883-8535-27802346759676 <<< 7557 1726882099.01320: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882099.01331: stdout chunk (state=3): >>><<< 7557 1726882099.01361: stderr chunk (state=3): >>><<< 7557 1726882099.01383: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882098.9832883-8535-27802346759676=/root/.ansible/tmp/ansible-tmp-1726882098.9832883-8535-27802346759676 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7557 1726882099.01902: variable 'ansible_module_compression' from source: unknown 7557 1726882099.01905: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-7557ap94rh2e/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 7557 1726882099.01907: variable 'ansible_facts' from source: unknown 7557 1726882099.01974: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882098.9832883-8535-27802346759676/AnsiballZ_network_connections.py 7557 1726882099.02201: Sending initial data 7557 1726882099.02210: Sent initial data (165 bytes) 7557 1726882099.02711: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7557 1726882099.02724: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7557 1726882099.02738: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882099.02806: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882099.02857: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882099.02873: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882099.02898: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882099.02970: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882099.04540: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 7557 1726882099.04606: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 7557 1726882099.04685: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-7557ap94rh2e/tmpln183bm0 /root/.ansible/tmp/ansible-tmp-1726882098.9832883-8535-27802346759676/AnsiballZ_network_connections.py <<< 7557 1726882099.04699: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882098.9832883-8535-27802346759676/AnsiballZ_network_connections.py" <<< 7557 1726882099.04763: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-7557ap94rh2e/tmpln183bm0" to remote "/root/.ansible/tmp/ansible-tmp-1726882098.9832883-8535-27802346759676/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882098.9832883-8535-27802346759676/AnsiballZ_network_connections.py" <<< 7557 1726882099.06006: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882099.06046: stderr chunk (state=3): >>><<< 7557 1726882099.06058: stdout chunk (state=3): >>><<< 7557 1726882099.06103: done transferring module to remote 7557 1726882099.06115: _low_level_execute_command(): starting 7557 1726882099.06192: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882098.9832883-8535-27802346759676/ /root/.ansible/tmp/ansible-tmp-1726882098.9832883-8535-27802346759676/AnsiballZ_network_connections.py && sleep 0' 7557 1726882099.06799: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7557 1726882099.06812: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882099.06828: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7557 1726882099.06870: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882099.06933: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882099.06952: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882099.06989: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882099.07035: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882099.08901: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882099.08905: stderr chunk (state=3): >>><<< 7557 1726882099.08907: stdout chunk (state=3): >>><<< 7557 1726882099.08909: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7557 1726882099.08912: _low_level_execute_command(): starting 7557 1726882099.08914: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882098.9832883-8535-27802346759676/AnsiballZ_network_connections.py && sleep 0' 7557 1726882099.09415: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7557 1726882099.09424: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7557 1726882099.09435: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882099.09455: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7557 1726882099.09467: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 <<< 7557 1726882099.09473: stderr chunk (state=3): >>>debug2: match not found <<< 7557 1726882099.09483: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882099.09500: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7557 1726882099.09549: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.10.229 is address <<< 7557 1726882099.09552: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7557 1726882099.09555: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7557 1726882099.09557: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882099.09559: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7557 1726882099.09561: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 <<< 7557 1726882099.09563: stderr chunk (state=3): >>>debug2: match found <<< 7557 1726882099.09610: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882099.09640: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882099.09654: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882099.09692: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882099.09749: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882099.42441: stdout chunk (state=3): >>>Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_40xy3te4/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_40xy3te4/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on veth0/24b01189-d26a-4e67-9260-0cb7eb810428: error=unknown <<< 7557 1726882099.42615: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "veth0", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "veth0", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 7557 1726882099.44366: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. 
<<< 7557 1726882099.44397: stderr chunk (state=3): >>><<< 7557 1726882099.44401: stdout chunk (state=3): >>><<< 7557 1726882099.44423: _low_level_execute_command() done: rc=0, stdout=Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_40xy3te4/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_40xy3te4/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on veth0/24b01189-d26a-4e67-9260-0cb7eb810428: error=unknown {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "veth0", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "veth0", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. 
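[editor's note] The module_args echoed back in this result show exactly what was requested: provider "nm" and a single connection veth0 with persistent_state: absent and state: down, i.e. bring the interface down and delete its profile. Note that although the module printed a traceback on stdout ("Connection volatilize aborted on veth0/... error=unknown"), it still exited rc=0 and reported changed: true, so the play continues. Declared through the role's public interface, the same request would look roughly like the sketch below; the connection values are copied from the module_args above, while the play structure itself is an assumption.

    # Hedged sketch of an equivalent role invocation (values from the log).
    - hosts: managed_node3
      roles:
        - role: fedora.linux_system_roles.network
          vars:
            network_provider: nm
            network_connections:
              - name: veth0
                persistent_state: absent   # remove the profile entirely
                state: down                # take the interface down first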
7557 1726882099.44451: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'veth0', 'persistent_state': 'absent', 'state': 'down'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882098.9832883-8535-27802346759676/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7557 1726882099.44467: _low_level_execute_command(): starting 7557 1726882099.44471: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882098.9832883-8535-27802346759676/ > /dev/null 2>&1 && sleep 0' 7557 1726882099.44924: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7557 1726882099.44927: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7557 1726882099.44930: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882099.44932: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 7557 1726882099.44934: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found <<< 7557 1726882099.44936: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882099.44986: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882099.44995: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882099.44998: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882099.45037: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882099.46850: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882099.46878: stderr chunk (state=3): >>><<< 7557 1726882099.46882: stdout chunk (state=3): >>><<< 7557 1726882099.46902: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7557 1726882099.46913: handler run complete 7557 1726882099.46930: attempt loop complete, returning result 7557 1726882099.46937: _execute() done 7557 1726882099.46940: dumping result to json 7557 1726882099.46944: done dumping result, returning 7557 1726882099.46952: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [12673a56-9f93-ed48-b3a5-000000000079] 7557 1726882099.46957: sending task result for task 12673a56-9f93-ed48-b3a5-000000000079 7557 1726882099.47052: done sending task result for task 12673a56-9f93-ed48-b3a5-000000000079 7557 1726882099.47055: WORKER PROCESS EXITING changed: [managed_node3] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "veth0", "persistent_state": "absent", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: 7557 1726882099.47148: no more pending results, returning what we have 7557 1726882099.47152: results queue empty 7557 1726882099.47153: checking for any_errors_fatal 7557 1726882099.47159: done checking for any_errors_fatal 7557 1726882099.47160: checking for max_fail_percentage 7557 1726882099.47161: done checking for max_fail_percentage 7557 1726882099.47162: checking to see if all hosts have failed and the running result is not ok 7557 1726882099.47163: done checking to see if all hosts have failed 7557 1726882099.47168: getting the remaining hosts for this loop 7557 1726882099.47170: done getting the remaining hosts for this loop 7557 1726882099.47174: getting the next task for host managed_node3 7557 1726882099.47179: done getting next task for host managed_node3 7557 1726882099.47183: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 7557 1726882099.47185: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7557 1726882099.47199: getting variables 7557 1726882099.47200: in VariableManager get_vars() 7557 1726882099.47247: Calling all_inventory to load vars for managed_node3 7557 1726882099.47250: Calling groups_inventory to load vars for managed_node3 7557 1726882099.47252: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882099.47262: Calling all_plugins_play to load vars for managed_node3 7557 1726882099.47264: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882099.47267: Calling groups_plugins_play to load vars for managed_node3 7557 1726882099.48228: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882099.49859: done with get_vars() 7557 1726882099.49887: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 21:28:19 -0400 (0:00:00.668) 0:00:25.352 ****** 7557 1726882099.49974: entering _queue_task() for managed_node3/fedora.linux_system_roles.network_state 7557 1726882099.50301: worker is 1 (out of 1 available) 7557 1726882099.50313: exiting _queue_task() for managed_node3/fedora.linux_system_roles.network_state 7557 1726882099.50325: done queuing things up, now waiting for results queue to drain 7557 1726882099.50327: waiting for pending results... 7557 1726882099.50784: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking state 7557 1726882099.50844: in run() - task 12673a56-9f93-ed48-b3a5-00000000007a 7557 1726882099.50866: variable 'ansible_search_path' from source: unknown 7557 1726882099.50901: variable 'ansible_search_path' from source: unknown 7557 1726882099.50928: calling self._execute() 7557 1726882099.51078: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882099.51145: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882099.51150: variable 'omit' from source: magic vars 7557 1726882099.51404: variable 'ansible_distribution_major_version' from source: facts 7557 1726882099.51414: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882099.51498: variable 'network_state' from source: role '' defaults 7557 1726882099.51506: Evaluated conditional (network_state != {}): False 7557 1726882099.51509: when evaluation is False, skipping this task 7557 1726882099.51511: _execute() done 7557 1726882099.51514: dumping result to json 7557 1726882099.51516: done dumping result, returning 7557 1726882099.51524: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking state [12673a56-9f93-ed48-b3a5-00000000007a] 7557 1726882099.51529: sending task result for task 12673a56-9f93-ed48-b3a5-00000000007a 7557 1726882099.51614: done sending task result for task 12673a56-9f93-ed48-b3a5-00000000007a 7557 1726882099.51617: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 7557 1726882099.51703: no more pending results, returning what we have 7557 1726882099.51706: results queue empty 7557 1726882099.51707: checking for any_errors_fatal 7557 1726882099.51714: done checking for any_errors_fatal 7557 1726882099.51715: checking for max_fail_percentage 7557 1726882099.51716: done checking for 
max_fail_percentage 7557 1726882099.51717: checking to see if all hosts have failed and the running result is not ok 7557 1726882099.51718: done checking to see if all hosts have failed 7557 1726882099.51719: getting the remaining hosts for this loop 7557 1726882099.51720: done getting the remaining hosts for this loop 7557 1726882099.51723: getting the next task for host managed_node3 7557 1726882099.51730: done getting next task for host managed_node3 7557 1726882099.51735: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 7557 1726882099.51737: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7557 1726882099.51753: getting variables 7557 1726882099.51754: in VariableManager get_vars() 7557 1726882099.51796: Calling all_inventory to load vars for managed_node3 7557 1726882099.51799: Calling groups_inventory to load vars for managed_node3 7557 1726882099.51801: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882099.51809: Calling all_plugins_play to load vars for managed_node3 7557 1726882099.51812: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882099.51814: Calling groups_plugins_play to load vars for managed_node3 7557 1726882099.52554: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882099.54609: done with get_vars() 7557 1726882099.54632: done getting variables 7557 1726882099.54686: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 21:28:19 -0400 (0:00:00.047) 0:00:25.400 ****** 7557 1726882099.54720: entering _queue_task() for managed_node3/debug 7557 1726882099.55201: worker is 1 (out of 1 available) 7557 1726882099.55211: exiting _queue_task() for managed_node3/debug 7557 1726882099.55222: done queuing things up, now waiting for results queue to drain 7557 1726882099.55223: waiting for pending results... 
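[editor's note] "Configure networking state" is skipped because network_state is still the role default ({}), and the run moves on to the debug task that prints the stderr captured from the earlier network_connections result (an empty string here, as the ok: output below shows). A minimal sketch of such a debug task follows; the real task sits at roles/network/tasks/main.yml:177, and this shape is an assumption, not the role's source.

    # Hedged sketch: print the stderr lines captured from the
    # network_connections module result registered earlier in the role.
    - name: Show stderr messages for the network_connections
      ansible.builtin.debug:
        var: __network_connections_result.stderr_lines
      when: ansible_distribution_major_version != '6'   # condition seen in the log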
7557 1726882099.55411: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 7557 1726882099.55454: in run() - task 12673a56-9f93-ed48-b3a5-00000000007b 7557 1726882099.55473: variable 'ansible_search_path' from source: unknown 7557 1726882099.55479: variable 'ansible_search_path' from source: unknown 7557 1726882099.55521: calling self._execute() 7557 1726882099.55623: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882099.55636: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882099.55649: variable 'omit' from source: magic vars 7557 1726882099.56619: variable 'ansible_distribution_major_version' from source: facts 7557 1726882099.56623: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882099.56625: variable 'omit' from source: magic vars 7557 1726882099.56644: variable 'omit' from source: magic vars 7557 1726882099.56841: variable 'omit' from source: magic vars 7557 1726882099.57058: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7557 1726882099.57062: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7557 1726882099.57064: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7557 1726882099.57180: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7557 1726882099.57203: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7557 1726882099.57335: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7557 1726882099.57602: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882099.57605: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882099.57644: Set connection var ansible_module_compression to ZIP_DEFLATED 7557 1726882099.57722: Set connection var ansible_shell_executable to /bin/sh 7557 1726882099.57729: Set connection var ansible_shell_type to sh 7557 1726882099.57738: Set connection var ansible_pipelining to False 7557 1726882099.57744: Set connection var ansible_connection to ssh 7557 1726882099.57752: Set connection var ansible_timeout to 10 7557 1726882099.57779: variable 'ansible_shell_executable' from source: unknown 7557 1726882099.57825: variable 'ansible_connection' from source: unknown 7557 1726882099.57833: variable 'ansible_module_compression' from source: unknown 7557 1726882099.57839: variable 'ansible_shell_type' from source: unknown 7557 1726882099.57845: variable 'ansible_shell_executable' from source: unknown 7557 1726882099.57852: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882099.58036: variable 'ansible_pipelining' from source: unknown 7557 1726882099.58039: variable 'ansible_timeout' from source: unknown 7557 1726882099.58041: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882099.58196: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 7557 1726882099.58214: variable 'omit' from source: 
magic vars 7557 1726882099.58260: starting attempt loop 7557 1726882099.58270: running the handler 7557 1726882099.58529: variable '__network_connections_result' from source: set_fact 7557 1726882099.58637: handler run complete 7557 1726882099.58714: attempt loop complete, returning result 7557 1726882099.58900: _execute() done 7557 1726882099.58904: dumping result to json 7557 1726882099.58907: done dumping result, returning 7557 1726882099.58910: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [12673a56-9f93-ed48-b3a5-00000000007b] 7557 1726882099.58912: sending task result for task 12673a56-9f93-ed48-b3a5-00000000007b 7557 1726882099.58983: done sending task result for task 12673a56-9f93-ed48-b3a5-00000000007b 7557 1726882099.58986: WORKER PROCESS EXITING ok: [managed_node3] => { "__network_connections_result.stderr_lines": [ "" ] } 7557 1726882099.59054: no more pending results, returning what we have 7557 1726882099.59058: results queue empty 7557 1726882099.59059: checking for any_errors_fatal 7557 1726882099.59066: done checking for any_errors_fatal 7557 1726882099.59067: checking for max_fail_percentage 7557 1726882099.59070: done checking for max_fail_percentage 7557 1726882099.59071: checking to see if all hosts have failed and the running result is not ok 7557 1726882099.59072: done checking to see if all hosts have failed 7557 1726882099.59072: getting the remaining hosts for this loop 7557 1726882099.59074: done getting the remaining hosts for this loop 7557 1726882099.59078: getting the next task for host managed_node3 7557 1726882099.59085: done getting next task for host managed_node3 7557 1726882099.59089: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 7557 1726882099.59096: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7557 1726882099.59109: getting variables 7557 1726882099.59110: in VariableManager get_vars() 7557 1726882099.59162: Calling all_inventory to load vars for managed_node3 7557 1726882099.59166: Calling groups_inventory to load vars for managed_node3 7557 1726882099.59169: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882099.59179: Calling all_plugins_play to load vars for managed_node3 7557 1726882099.59182: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882099.59185: Calling groups_plugins_play to load vars for managed_node3 7557 1726882099.62410: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882099.65776: done with get_vars() 7557 1726882099.65818: done getting variables 7557 1726882099.65883: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 21:28:19 -0400 (0:00:00.117) 0:00:25.517 ****** 7557 1726882099.66443: entering _queue_task() for managed_node3/debug 7557 1726882099.67439: worker is 1 (out of 1 available) 7557 1726882099.67448: exiting _queue_task() for managed_node3/debug 7557 1726882099.67460: done queuing things up, now waiting for results queue to drain 7557 1726882099.67461: waiting for pending results... 7557 1726882099.67915: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 7557 1726882099.67921: in run() - task 12673a56-9f93-ed48-b3a5-00000000007c 7557 1726882099.68026: variable 'ansible_search_path' from source: unknown 7557 1726882099.68035: variable 'ansible_search_path' from source: unknown 7557 1726882099.68078: calling self._execute() 7557 1726882099.68302: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882099.68518: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882099.68522: variable 'omit' from source: magic vars 7557 1726882099.69182: variable 'ansible_distribution_major_version' from source: facts 7557 1726882099.69206: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882099.69219: variable 'omit' from source: magic vars 7557 1726882099.69398: variable 'omit' from source: magic vars 7557 1726882099.69443: variable 'omit' from source: magic vars 7557 1726882099.69490: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7557 1726882099.69535: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7557 1726882099.69776: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7557 1726882099.69780: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7557 1726882099.69782: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7557 1726882099.69785: variable 'inventory_hostname' from source: host vars for 
'managed_node3' 7557 1726882099.69788: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882099.69790: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882099.69980: Set connection var ansible_module_compression to ZIP_DEFLATED 7557 1726882099.70301: Set connection var ansible_shell_executable to /bin/sh 7557 1726882099.70305: Set connection var ansible_shell_type to sh 7557 1726882099.70307: Set connection var ansible_pipelining to False 7557 1726882099.70310: Set connection var ansible_connection to ssh 7557 1726882099.70312: Set connection var ansible_timeout to 10 7557 1726882099.70314: variable 'ansible_shell_executable' from source: unknown 7557 1726882099.70316: variable 'ansible_connection' from source: unknown 7557 1726882099.70319: variable 'ansible_module_compression' from source: unknown 7557 1726882099.70321: variable 'ansible_shell_type' from source: unknown 7557 1726882099.70323: variable 'ansible_shell_executable' from source: unknown 7557 1726882099.70325: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882099.70327: variable 'ansible_pipelining' from source: unknown 7557 1726882099.70329: variable 'ansible_timeout' from source: unknown 7557 1726882099.70331: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882099.70450: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 7557 1726882099.70530: variable 'omit' from source: magic vars 7557 1726882099.70539: starting attempt loop 7557 1726882099.70548: running the handler 7557 1726882099.70704: variable '__network_connections_result' from source: set_fact 7557 1726882099.70788: variable '__network_connections_result' from source: set_fact 7557 1726882099.71061: handler run complete 7557 1726882099.71091: attempt loop complete, returning result 7557 1726882099.71172: _execute() done 7557 1726882099.71180: dumping result to json 7557 1726882099.71189: done dumping result, returning 7557 1726882099.71207: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [12673a56-9f93-ed48-b3a5-00000000007c] 7557 1726882099.71217: sending task result for task 12673a56-9f93-ed48-b3a5-00000000007c ok: [managed_node3] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "veth0", "persistent_state": "absent", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "\n", "stderr_lines": [ "" ] } } 7557 1726882099.71417: no more pending results, returning what we have 7557 1726882099.71421: results queue empty 7557 1726882099.71423: checking for any_errors_fatal 7557 1726882099.71430: done checking for any_errors_fatal 7557 1726882099.71431: checking for max_fail_percentage 7557 1726882099.71434: done checking for max_fail_percentage 7557 1726882099.71435: checking to see if all hosts have failed and the running result is not ok 7557 1726882099.71436: done checking to see if all hosts have failed 7557 1726882099.71436: getting the remaining hosts for this loop 7557 1726882099.71438: 
done getting the remaining hosts for this loop 7557 1726882099.71441: getting the next task for host managed_node3 7557 1726882099.71448: done getting next task for host managed_node3 7557 1726882099.71452: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 7557 1726882099.71456: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7557 1726882099.71467: getting variables 7557 1726882099.71468: in VariableManager get_vars() 7557 1726882099.71827: Calling all_inventory to load vars for managed_node3 7557 1726882099.71830: Calling groups_inventory to load vars for managed_node3 7557 1726882099.71833: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882099.71844: Calling all_plugins_play to load vars for managed_node3 7557 1726882099.71847: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882099.71850: Calling groups_plugins_play to load vars for managed_node3 7557 1726882099.72508: done sending task result for task 12673a56-9f93-ed48-b3a5-00000000007c 7557 1726882099.72511: WORKER PROCESS EXITING 7557 1726882099.73645: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882099.76169: done with get_vars() 7557 1726882099.76196: done getting variables 7557 1726882099.76240: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 21:28:19 -0400 (0:00:00.098) 0:00:25.615 ****** 7557 1726882099.76270: entering _queue_task() for managed_node3/debug 7557 1726882099.76521: worker is 1 (out of 1 available) 7557 1726882099.76533: exiting _queue_task() for managed_node3/debug 7557 1726882099.76546: done queuing things up, now waiting for results queue to drain 7557 1726882099.76548: waiting for pending results... 
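
The debug task that just finished dumped the full __network_connections_result fact. The task queued here, "Show debug messages for the network_state" (main.yml:186), is skipped a few entries later because the conditional network_state != {} evaluates to False. A minimal sketch of such a conditional debug task follows; the when expression is taken from the logged false_condition, while the variable name is assumed from the task title rather than from the role source:

    # Sketch only: the `when` matches the log, the `var` is an assumption.
    - name: Show debug messages for the network_state
      debug:
        var: network_state
      when: network_state != {}
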
7557 1726882099.76725: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 7557 1726882099.76824: in run() - task 12673a56-9f93-ed48-b3a5-00000000007d 7557 1726882099.76835: variable 'ansible_search_path' from source: unknown 7557 1726882099.76838: variable 'ansible_search_path' from source: unknown 7557 1726882099.76866: calling self._execute() 7557 1726882099.76944: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882099.76950: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882099.76959: variable 'omit' from source: magic vars 7557 1726882099.77227: variable 'ansible_distribution_major_version' from source: facts 7557 1726882099.77237: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882099.77323: variable 'network_state' from source: role '' defaults 7557 1726882099.77333: Evaluated conditional (network_state != {}): False 7557 1726882099.77336: when evaluation is False, skipping this task 7557 1726882099.77339: _execute() done 7557 1726882099.77341: dumping result to json 7557 1726882099.77343: done dumping result, returning 7557 1726882099.77351: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [12673a56-9f93-ed48-b3a5-00000000007d] 7557 1726882099.77356: sending task result for task 12673a56-9f93-ed48-b3a5-00000000007d 7557 1726882099.77440: done sending task result for task 12673a56-9f93-ed48-b3a5-00000000007d 7557 1726882099.77443: WORKER PROCESS EXITING skipping: [managed_node3] => { "false_condition": "network_state != {}" } 7557 1726882099.77487: no more pending results, returning what we have 7557 1726882099.77491: results queue empty 7557 1726882099.77496: checking for any_errors_fatal 7557 1726882099.77507: done checking for any_errors_fatal 7557 1726882099.77508: checking for max_fail_percentage 7557 1726882099.77509: done checking for max_fail_percentage 7557 1726882099.77510: checking to see if all hosts have failed and the running result is not ok 7557 1726882099.77511: done checking to see if all hosts have failed 7557 1726882099.77512: getting the remaining hosts for this loop 7557 1726882099.77513: done getting the remaining hosts for this loop 7557 1726882099.77516: getting the next task for host managed_node3 7557 1726882099.77522: done getting next task for host managed_node3 7557 1726882099.77526: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 7557 1726882099.77529: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7557 1726882099.77549: getting variables 7557 1726882099.77550: in VariableManager get_vars() 7557 1726882099.77608: Calling all_inventory to load vars for managed_node3 7557 1726882099.77611: Calling groups_inventory to load vars for managed_node3 7557 1726882099.77614: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882099.77632: Calling all_plugins_play to load vars for managed_node3 7557 1726882099.77635: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882099.77641: Calling groups_plugins_play to load vars for managed_node3 7557 1726882099.79565: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882099.80495: done with get_vars() 7557 1726882099.80514: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 21:28:19 -0400 (0:00:00.043) 0:00:25.658 ****** 7557 1726882099.80585: entering _queue_task() for managed_node3/ping 7557 1726882099.80905: worker is 1 (out of 1 available) 7557 1726882099.80919: exiting _queue_task() for managed_node3/ping 7557 1726882099.80933: done queuing things up, now waiting for results queue to drain 7557 1726882099.80935: waiting for pending results... 7557 1726882099.81336: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Re-test connectivity 7557 1726882099.81440: in run() - task 12673a56-9f93-ed48-b3a5-00000000007e 7557 1726882099.81468: variable 'ansible_search_path' from source: unknown 7557 1726882099.81477: variable 'ansible_search_path' from source: unknown 7557 1726882099.81523: calling self._execute() 7557 1726882099.81648: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882099.81664: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882099.81752: variable 'omit' from source: magic vars 7557 1726882099.82104: variable 'ansible_distribution_major_version' from source: facts 7557 1726882099.82127: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882099.82139: variable 'omit' from source: magic vars 7557 1726882099.82208: variable 'omit' from source: magic vars 7557 1726882099.82245: variable 'omit' from source: magic vars 7557 1726882099.82287: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7557 1726882099.82337: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7557 1726882099.82360: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7557 1726882099.82380: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7557 1726882099.82425: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7557 1726882099.82446: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7557 1726882099.82457: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882099.82508: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882099.82586: Set connection var ansible_module_compression to ZIP_DEFLATED 7557 1726882099.82601: Set connection var ansible_shell_executable to 
/bin/sh 7557 1726882099.82608: Set connection var ansible_shell_type to sh 7557 1726882099.82627: Set connection var ansible_pipelining to False 7557 1726882099.82641: Set connection var ansible_connection to ssh 7557 1726882099.82653: Set connection var ansible_timeout to 10 7557 1726882099.82683: variable 'ansible_shell_executable' from source: unknown 7557 1726882099.82724: variable 'ansible_connection' from source: unknown 7557 1726882099.82727: variable 'ansible_module_compression' from source: unknown 7557 1726882099.82729: variable 'ansible_shell_type' from source: unknown 7557 1726882099.82731: variable 'ansible_shell_executable' from source: unknown 7557 1726882099.82733: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882099.82734: variable 'ansible_pipelining' from source: unknown 7557 1726882099.82736: variable 'ansible_timeout' from source: unknown 7557 1726882099.82738: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882099.83018: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 7557 1726882099.83061: variable 'omit' from source: magic vars 7557 1726882099.83064: starting attempt loop 7557 1726882099.83066: running the handler 7557 1726882099.83069: _low_level_execute_command(): starting 7557 1726882099.83095: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7557 1726882099.83781: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7557 1726882099.83815: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7557 1726882099.83862: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882099.83924: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882099.83927: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882099.83942: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882099.84031: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882099.85712: stdout chunk (state=3): >>>/root <<< 7557 1726882099.85883: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882099.85887: stderr chunk (state=3): >>><<< 7557 1726882099.85889: stdout chunk (state=3): >>><<< 7557 1726882099.86033: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7557 1726882099.86038: _low_level_execute_command(): starting 7557 1726882099.86041: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882099.8593166-8574-264635250852265 `" && echo ansible-tmp-1726882099.8593166-8574-264635250852265="` echo /root/.ansible/tmp/ansible-tmp-1726882099.8593166-8574-264635250852265 `" ) && sleep 0' 7557 1726882099.86711: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found <<< 7557 1726882099.86814: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882099.87016: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882099.87100: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882099.88956: stdout chunk (state=3): >>>ansible-tmp-1726882099.8593166-8574-264635250852265=/root/.ansible/tmp/ansible-tmp-1726882099.8593166-8574-264635250852265 <<< 7557 1726882099.89137: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882099.89141: stdout chunk (state=3): >>><<< 7557 1726882099.89143: stderr chunk (state=3): >>><<< 7557 1726882099.89302: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882099.8593166-8574-264635250852265=/root/.ansible/tmp/ansible-tmp-1726882099.8593166-8574-264635250852265 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7557 1726882099.89305: variable 'ansible_module_compression' from source: unknown 7557 1726882099.89308: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-7557ap94rh2e/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 7557 1726882099.89316: variable 'ansible_facts' from source: unknown 7557 1726882099.89412: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882099.8593166-8574-264635250852265/AnsiballZ_ping.py 7557 1726882099.89546: Sending initial data 7557 1726882099.89640: Sent initial data (151 bytes) 7557 1726882099.90126: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882099.90130: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7557 1726882099.90166: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found <<< 7557 1726882099.90169: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 7557 1726882099.90173: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found <<< 7557 1726882099.90176: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882099.90227: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882099.90232: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882099.90278: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882099.91791: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension 
"hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 7557 1726882099.91797: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 7557 1726882099.91832: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 7557 1726882099.91876: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-7557ap94rh2e/tmptg3ul_hm /root/.ansible/tmp/ansible-tmp-1726882099.8593166-8574-264635250852265/AnsiballZ_ping.py <<< 7557 1726882099.91882: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882099.8593166-8574-264635250852265/AnsiballZ_ping.py" <<< 7557 1726882099.91923: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-7557ap94rh2e/tmptg3ul_hm" to remote "/root/.ansible/tmp/ansible-tmp-1726882099.8593166-8574-264635250852265/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882099.8593166-8574-264635250852265/AnsiballZ_ping.py" <<< 7557 1726882099.92443: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882099.92485: stderr chunk (state=3): >>><<< 7557 1726882099.92488: stdout chunk (state=3): >>><<< 7557 1726882099.92509: done transferring module to remote 7557 1726882099.92517: _low_level_execute_command(): starting 7557 1726882099.92522: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882099.8593166-8574-264635250852265/ /root/.ansible/tmp/ansible-tmp-1726882099.8593166-8574-264635250852265/AnsiballZ_ping.py && sleep 0' 7557 1726882099.92948: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7557 1726882099.92951: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found <<< 7557 1726882099.92954: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 7557 1726882099.92956: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882099.92958: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882099.93018: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882099.93021: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882099.93059: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 7557 1726882099.94752: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882099.94776: stderr chunk (state=3): >>><<< 7557 1726882099.94779: stdout chunk (state=3): >>><<< 7557 1726882099.94795: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7557 1726882099.94804: _low_level_execute_command(): starting 7557 1726882099.94809: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882099.8593166-8574-264635250852265/AnsiballZ_ping.py && sleep 0' 7557 1726882099.95246: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7557 1726882099.95251: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found <<< 7557 1726882099.95253: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882099.95256: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882099.95258: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882099.95308: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882099.95312: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882099.95367: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882100.09957: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 7557 1726882100.11180: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared 
connection to 10.31.10.229 closed. <<< 7557 1726882100.11208: stderr chunk (state=3): >>><<< 7557 1726882100.11212: stdout chunk (state=3): >>><<< 7557 1726882100.11228: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. 7557 1726882100.11249: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882099.8593166-8574-264635250852265/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7557 1726882100.11258: _low_level_execute_command(): starting 7557 1726882100.11264: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882099.8593166-8574-264635250852265/ > /dev/null 2>&1 && sleep 0' 7557 1726882100.11729: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7557 1726882100.11732: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found <<< 7557 1726882100.11734: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882100.11737: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882100.11740: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 
10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882100.11785: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882100.11788: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882100.11795: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882100.11839: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882100.13632: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882100.13657: stderr chunk (state=3): >>><<< 7557 1726882100.13660: stdout chunk (state=3): >>><<< 7557 1726882100.13676: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7557 1726882100.13683: handler run complete 7557 1726882100.13698: attempt loop complete, returning result 7557 1726882100.13701: _execute() done 7557 1726882100.13703: dumping result to json 7557 1726882100.13705: done dumping result, returning 7557 1726882100.13713: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Re-test connectivity [12673a56-9f93-ed48-b3a5-00000000007e] 7557 1726882100.13717: sending task result for task 12673a56-9f93-ed48-b3a5-00000000007e 7557 1726882100.13804: done sending task result for task 12673a56-9f93-ed48-b3a5-00000000007e 7557 1726882100.13807: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "ping": "pong" } 7557 1726882100.13866: no more pending results, returning what we have 7557 1726882100.13869: results queue empty 7557 1726882100.13870: checking for any_errors_fatal 7557 1726882100.13876: done checking for any_errors_fatal 7557 1726882100.13876: checking for max_fail_percentage 7557 1726882100.13878: done checking for max_fail_percentage 7557 1726882100.13879: checking to see if all hosts have failed and the running result is not ok 7557 1726882100.13879: done checking to see if all hosts have failed 7557 1726882100.13880: getting the remaining hosts for this loop 7557 1726882100.13881: done getting the remaining hosts for this loop 7557 1726882100.13884: getting the next task for host managed_node3 7557 1726882100.13897: done getting next task for host managed_node3 7557 1726882100.13899: ^ task is: TASK: meta (role_complete) 
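
The block above is the "Re-test connectivity" task (main.yml:192): the controller reuses the existing SSH ControlMaster socket, creates a remote temp directory, transfers AnsiballZ_ping.py over SFTP, runs it with /usr/bin/python3.12, reads back {"ping": "pong"}, and removes the temp directory. The task driving this is in all likelihood just the ping module with no arguments; a sketch under that assumption:

    # Sketch only: inferred from the module name and the "pong" result.
    - name: Re-test connectivity
      ping:

The five _low_level_execute_command() calls in the log (echo ~, mkdir for the temp path, chmod, the module run, and the final rm -rf) are the standard module-execution sequence for a non-pipelined SSH connection, consistent with "Set connection var ansible_pipelining to False" earlier in this task.
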
7557 1726882100.13902: ^ state is: HOST STATE: block=2, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7557 1726882100.13913: getting variables 7557 1726882100.13916: in VariableManager get_vars() 7557 1726882100.13965: Calling all_inventory to load vars for managed_node3 7557 1726882100.13968: Calling groups_inventory to load vars for managed_node3 7557 1726882100.13970: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882100.13978: Calling all_plugins_play to load vars for managed_node3 7557 1726882100.13981: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882100.13983: Calling groups_plugins_play to load vars for managed_node3 7557 1726882100.14771: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882100.19005: done with get_vars() 7557 1726882100.19021: done getting variables 7557 1726882100.19067: done queuing things up, now waiting for results queue to drain 7557 1726882100.19068: results queue empty 7557 1726882100.19069: checking for any_errors_fatal 7557 1726882100.19070: done checking for any_errors_fatal 7557 1726882100.19071: checking for max_fail_percentage 7557 1726882100.19072: done checking for max_fail_percentage 7557 1726882100.19072: checking to see if all hosts have failed and the running result is not ok 7557 1726882100.19073: done checking to see if all hosts have failed 7557 1726882100.19073: getting the remaining hosts for this loop 7557 1726882100.19074: done getting the remaining hosts for this loop 7557 1726882100.19075: getting the next task for host managed_node3 7557 1726882100.19078: done getting next task for host managed_node3 7557 1726882100.19079: ^ task is: TASK: Include the task 'manage_test_interface.yml' 7557 1726882100.19080: ^ state is: HOST STATE: block=2, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7557 1726882100.19081: getting variables 7557 1726882100.19082: in VariableManager get_vars() 7557 1726882100.19096: Calling all_inventory to load vars for managed_node3 7557 1726882100.19098: Calling groups_inventory to load vars for managed_node3 7557 1726882100.19100: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882100.19104: Calling all_plugins_play to load vars for managed_node3 7557 1726882100.19106: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882100.19108: Calling groups_plugins_play to load vars for managed_node3 7557 1726882100.19717: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882100.20552: done with get_vars() 7557 1726882100.20565: done getting variables TASK [Include the task 'manage_test_interface.yml'] **************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_auto_gateway.yml:79 Friday 20 September 2024 21:28:20 -0400 (0:00:00.400) 0:00:26.059 ****** 7557 1726882100.20617: entering _queue_task() for managed_node3/include_tasks 7557 1726882100.20871: worker is 1 (out of 1 available) 7557 1726882100.20885: exiting _queue_task() for managed_node3/include_tasks 7557 1726882100.20903: done queuing things up, now waiting for results queue to drain 7557 1726882100.20905: waiting for pending results... 7557 1726882100.21076: running TaskExecutor() for managed_node3/TASK: Include the task 'manage_test_interface.yml' 7557 1726882100.21144: in run() - task 12673a56-9f93-ed48-b3a5-0000000000ae 7557 1726882100.21156: variable 'ansible_search_path' from source: unknown 7557 1726882100.21185: calling self._execute() 7557 1726882100.21270: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882100.21275: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882100.21285: variable 'omit' from source: magic vars 7557 1726882100.21573: variable 'ansible_distribution_major_version' from source: facts 7557 1726882100.21579: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882100.21585: _execute() done 7557 1726882100.21588: dumping result to json 7557 1726882100.21591: done dumping result, returning 7557 1726882100.21600: done running TaskExecutor() for managed_node3/TASK: Include the task 'manage_test_interface.yml' [12673a56-9f93-ed48-b3a5-0000000000ae] 7557 1726882100.21606: sending task result for task 12673a56-9f93-ed48-b3a5-0000000000ae 7557 1726882100.21704: done sending task result for task 12673a56-9f93-ed48-b3a5-0000000000ae 7557 1726882100.21708: WORKER PROCESS EXITING 7557 1726882100.21735: no more pending results, returning what we have 7557 1726882100.21740: in VariableManager get_vars() 7557 1726882100.21791: Calling all_inventory to load vars for managed_node3 7557 1726882100.21798: Calling groups_inventory to load vars for managed_node3 7557 1726882100.21801: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882100.21815: Calling all_plugins_play to load vars for managed_node3 7557 1726882100.21819: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882100.21822: Calling groups_plugins_play to load vars for managed_node3 7557 1726882100.22697: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882100.23536: done with get_vars() 7557 1726882100.23549: variable 'ansible_search_path' 
from source: unknown 7557 1726882100.23558: we have included files to process 7557 1726882100.23559: generating all_blocks data 7557 1726882100.23560: done generating all_blocks data 7557 1726882100.23564: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 7557 1726882100.23565: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 7557 1726882100.23567: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 7557 1726882100.23816: in VariableManager get_vars() 7557 1726882100.23834: done with get_vars() 7557 1726882100.24257: done processing included file 7557 1726882100.24258: iterating over new_blocks loaded from include file 7557 1726882100.24259: in VariableManager get_vars() 7557 1726882100.24273: done with get_vars() 7557 1726882100.24274: filtering new block on tags 7557 1726882100.24298: done filtering new block on tags 7557 1726882100.24300: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml for managed_node3 7557 1726882100.24303: extending task lists for all hosts with included blocks 7557 1726882100.26936: done extending task lists 7557 1726882100.26938: done processing included files 7557 1726882100.26939: results queue empty 7557 1726882100.26939: checking for any_errors_fatal 7557 1726882100.26940: done checking for any_errors_fatal 7557 1726882100.26941: checking for max_fail_percentage 7557 1726882100.26942: done checking for max_fail_percentage 7557 1726882100.26942: checking to see if all hosts have failed and the running result is not ok 7557 1726882100.26943: done checking to see if all hosts have failed 7557 1726882100.26943: getting the remaining hosts for this loop 7557 1726882100.26944: done getting the remaining hosts for this loop 7557 1726882100.26945: getting the next task for host managed_node3 7557 1726882100.26948: done getting next task for host managed_node3 7557 1726882100.26950: ^ task is: TASK: Ensure state in ["present", "absent"] 7557 1726882100.26951: ^ state is: HOST STATE: block=2, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7557 1726882100.26953: getting variables 7557 1726882100.26953: in VariableManager get_vars() 7557 1726882100.26969: Calling all_inventory to load vars for managed_node3 7557 1726882100.26971: Calling groups_inventory to load vars for managed_node3 7557 1726882100.26972: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882100.26977: Calling all_plugins_play to load vars for managed_node3 7557 1726882100.26979: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882100.26980: Calling groups_plugins_play to load vars for managed_node3 7557 1726882100.27648: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882100.28576: done with get_vars() 7557 1726882100.28591: done getting variables 7557 1726882100.28626: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Ensure state in ["present", "absent"]] *********************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:3 Friday 20 September 2024 21:28:20 -0400 (0:00:00.080) 0:00:26.139 ****** 7557 1726882100.28649: entering _queue_task() for managed_node3/fail 7557 1726882100.28908: worker is 1 (out of 1 available) 7557 1726882100.28922: exiting _queue_task() for managed_node3/fail 7557 1726882100.28935: done queuing things up, now waiting for results queue to drain 7557 1726882100.28937: waiting for pending results... 7557 1726882100.29112: running TaskExecutor() for managed_node3/TASK: Ensure state in ["present", "absent"] 7557 1726882100.29174: in run() - task 12673a56-9f93-ed48-b3a5-000000000dff 7557 1726882100.29187: variable 'ansible_search_path' from source: unknown 7557 1726882100.29190: variable 'ansible_search_path' from source: unknown 7557 1726882100.29222: calling self._execute() 7557 1726882100.29302: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882100.29307: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882100.29317: variable 'omit' from source: magic vars 7557 1726882100.29583: variable 'ansible_distribution_major_version' from source: facts 7557 1726882100.29597: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882100.29688: variable 'state' from source: include params 7557 1726882100.29696: Evaluated conditional (state not in ["present", "absent"]): False 7557 1726882100.29699: when evaluation is False, skipping this task 7557 1726882100.29701: _execute() done 7557 1726882100.29704: dumping result to json 7557 1726882100.29709: done dumping result, returning 7557 1726882100.29711: done running TaskExecutor() for managed_node3/TASK: Ensure state in ["present", "absent"] [12673a56-9f93-ed48-b3a5-000000000dff] 7557 1726882100.29721: sending task result for task 12673a56-9f93-ed48-b3a5-000000000dff 7557 1726882100.29798: done sending task result for task 12673a56-9f93-ed48-b3a5-000000000dff 7557 1726882100.29801: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "state not in [\"present\", \"absent\"]", "skip_reason": "Conditional result was False" } 7557 1726882100.29866: no more pending results, returning what we have 7557 
1726882100.29871: results queue empty 7557 1726882100.29872: checking for any_errors_fatal 7557 1726882100.29873: done checking for any_errors_fatal 7557 1726882100.29874: checking for max_fail_percentage 7557 1726882100.29875: done checking for max_fail_percentage 7557 1726882100.29876: checking to see if all hosts have failed and the running result is not ok 7557 1726882100.29877: done checking to see if all hosts have failed 7557 1726882100.29878: getting the remaining hosts for this loop 7557 1726882100.29879: done getting the remaining hosts for this loop 7557 1726882100.29882: getting the next task for host managed_node3 7557 1726882100.29887: done getting next task for host managed_node3 7557 1726882100.29890: ^ task is: TASK: Ensure type in ["dummy", "tap", "veth"] 7557 1726882100.29896: ^ state is: HOST STATE: block=2, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7557 1726882100.29899: getting variables 7557 1726882100.29901: in VariableManager get_vars() 7557 1726882100.29945: Calling all_inventory to load vars for managed_node3 7557 1726882100.29948: Calling groups_inventory to load vars for managed_node3 7557 1726882100.29950: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882100.29960: Calling all_plugins_play to load vars for managed_node3 7557 1726882100.29963: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882100.29966: Calling groups_plugins_play to load vars for managed_node3 7557 1726882100.30709: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882100.31759: done with get_vars() 7557 1726882100.31779: done getting variables 7557 1726882100.31839: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Ensure type in ["dummy", "tap", "veth"]] ********************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:8 Friday 20 September 2024 21:28:20 -0400 (0:00:00.032) 0:00:26.171 ****** 7557 1726882100.31875: entering _queue_task() for managed_node3/fail 7557 1726882100.32168: worker is 1 (out of 1 available) 7557 1726882100.32299: exiting _queue_task() for managed_node3/fail 7557 1726882100.32311: done queuing things up, now waiting for results queue to drain 7557 1726882100.32313: waiting for pending results... 
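
The task that just completed and the one being queued here both come from manage_test_interface.yml (lines 3 and 8) and act as parameter guards: each loads the fail action and is skipped because its when condition is False (the log records the false_condition strings state not in ["present", "absent"] and type not in ["dummy", "tap", "veth"]). A sketch of what such guard tasks look like; the when expressions are taken from the log, while the msg texts are illustrative assumptions:

    # Sketch only: `when` copied from the logged false_condition values,
    # `msg` wording is assumed.
    - name: Ensure state in ["present", "absent"]
      fail:
        msg: "value of state must be 'present' or 'absent'"
      when: state not in ["present", "absent"]

    - name: Ensure type in ["dummy", "tap", "veth"]
      fail:
        msg: "value of type must be 'dummy', 'tap' or 'veth'"
      when: type not in ["dummy", "tap", "veth"]
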
7557 1726882100.32573: running TaskExecutor() for managed_node3/TASK: Ensure type in ["dummy", "tap", "veth"] 7557 1726882100.32642: in run() - task 12673a56-9f93-ed48-b3a5-000000000e00 7557 1726882100.32654: variable 'ansible_search_path' from source: unknown 7557 1726882100.32658: variable 'ansible_search_path' from source: unknown 7557 1726882100.32692: calling self._execute() 7557 1726882100.32780: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882100.32786: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882100.32797: variable 'omit' from source: magic vars 7557 1726882100.33085: variable 'ansible_distribution_major_version' from source: facts 7557 1726882100.33094: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882100.33195: variable 'type' from source: play vars 7557 1726882100.33201: Evaluated conditional (type not in ["dummy", "tap", "veth"]): False 7557 1726882100.33204: when evaluation is False, skipping this task 7557 1726882100.33207: _execute() done 7557 1726882100.33210: dumping result to json 7557 1726882100.33212: done dumping result, returning 7557 1726882100.33218: done running TaskExecutor() for managed_node3/TASK: Ensure type in ["dummy", "tap", "veth"] [12673a56-9f93-ed48-b3a5-000000000e00] 7557 1726882100.33225: sending task result for task 12673a56-9f93-ed48-b3a5-000000000e00 7557 1726882100.33303: done sending task result for task 12673a56-9f93-ed48-b3a5-000000000e00 7557 1726882100.33306: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "type not in [\"dummy\", \"tap\", \"veth\"]", "skip_reason": "Conditional result was False" } 7557 1726882100.33373: no more pending results, returning what we have 7557 1726882100.33378: results queue empty 7557 1726882100.33379: checking for any_errors_fatal 7557 1726882100.33389: done checking for any_errors_fatal 7557 1726882100.33390: checking for max_fail_percentage 7557 1726882100.33391: done checking for max_fail_percentage 7557 1726882100.33394: checking to see if all hosts have failed and the running result is not ok 7557 1726882100.33395: done checking to see if all hosts have failed 7557 1726882100.33398: getting the remaining hosts for this loop 7557 1726882100.33400: done getting the remaining hosts for this loop 7557 1726882100.33404: getting the next task for host managed_node3 7557 1726882100.33412: done getting next task for host managed_node3 7557 1726882100.33415: ^ task is: TASK: Include the task 'show_interfaces.yml' 7557 1726882100.33426: ^ state is: HOST STATE: block=2, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7557 1726882100.33433: getting variables 7557 1726882100.33435: in VariableManager get_vars() 7557 1726882100.33490: Calling all_inventory to load vars for managed_node3 7557 1726882100.33495: Calling groups_inventory to load vars for managed_node3 7557 1726882100.33497: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882100.33510: Calling all_plugins_play to load vars for managed_node3 7557 1726882100.33513: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882100.33518: Calling groups_plugins_play to load vars for managed_node3 7557 1726882100.34670: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882100.36140: done with get_vars() 7557 1726882100.36163: done getting variables TASK [Include the task 'show_interfaces.yml'] ********************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:13 Friday 20 September 2024 21:28:20 -0400 (0:00:00.043) 0:00:26.215 ****** 7557 1726882100.36255: entering _queue_task() for managed_node3/include_tasks 7557 1726882100.36539: worker is 1 (out of 1 available) 7557 1726882100.36550: exiting _queue_task() for managed_node3/include_tasks 7557 1726882100.36562: done queuing things up, now waiting for results queue to drain 7557 1726882100.36563: waiting for pending results... 7557 1726882100.36920: running TaskExecutor() for managed_node3/TASK: Include the task 'show_interfaces.yml' 7557 1726882100.37001: in run() - task 12673a56-9f93-ed48-b3a5-000000000e01 7557 1726882100.37006: variable 'ansible_search_path' from source: unknown 7557 1726882100.37008: variable 'ansible_search_path' from source: unknown 7557 1726882100.37035: calling self._execute() 7557 1726882100.37143: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882100.37199: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882100.37202: variable 'omit' from source: magic vars 7557 1726882100.37542: variable 'ansible_distribution_major_version' from source: facts 7557 1726882100.37564: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882100.37574: _execute() done 7557 1726882100.37582: dumping result to json 7557 1726882100.37589: done dumping result, returning 7557 1726882100.37601: done running TaskExecutor() for managed_node3/TASK: Include the task 'show_interfaces.yml' [12673a56-9f93-ed48-b3a5-000000000e01] 7557 1726882100.37610: sending task result for task 12673a56-9f93-ed48-b3a5-000000000e01 7557 1726882100.37732: no more pending results, returning what we have 7557 1726882100.37738: in VariableManager get_vars() 7557 1726882100.37801: Calling all_inventory to load vars for managed_node3 7557 1726882100.37804: Calling groups_inventory to load vars for managed_node3 7557 1726882100.37807: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882100.37821: Calling all_plugins_play to load vars for managed_node3 7557 1726882100.37824: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882100.37827: Calling groups_plugins_play to load vars for managed_node3 7557 1726882100.38676: done sending task result for task 12673a56-9f93-ed48-b3a5-000000000e01 7557 1726882100.38679: WORKER PROCESS EXITING 7557 1726882100.39357: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882100.41563: 
done with get_vars() 7557 1726882100.41590: variable 'ansible_search_path' from source: unknown 7557 1726882100.41592: variable 'ansible_search_path' from source: unknown 7557 1726882100.41637: we have included files to process 7557 1726882100.41639: generating all_blocks data 7557 1726882100.41640: done generating all_blocks data 7557 1726882100.41645: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 7557 1726882100.41647: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 7557 1726882100.41649: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 7557 1726882100.41762: in VariableManager get_vars() 7557 1726882100.41800: done with get_vars() 7557 1726882100.41919: done processing included file 7557 1726882100.41921: iterating over new_blocks loaded from include file 7557 1726882100.41923: in VariableManager get_vars() 7557 1726882100.41947: done with get_vars() 7557 1726882100.41949: filtering new block on tags 7557 1726882100.41967: done filtering new block on tags 7557 1726882100.41969: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml for managed_node3 7557 1726882100.41975: extending task lists for all hosts with included blocks 7557 1726882100.42386: done extending task lists 7557 1726882100.42387: done processing included files 7557 1726882100.42388: results queue empty 7557 1726882100.42389: checking for any_errors_fatal 7557 1726882100.42396: done checking for any_errors_fatal 7557 1726882100.42397: checking for max_fail_percentage 7557 1726882100.42398: done checking for max_fail_percentage 7557 1726882100.42399: checking to see if all hosts have failed and the running result is not ok 7557 1726882100.42399: done checking to see if all hosts have failed 7557 1726882100.42400: getting the remaining hosts for this loop 7557 1726882100.42401: done getting the remaining hosts for this loop 7557 1726882100.42404: getting the next task for host managed_node3 7557 1726882100.42408: done getting next task for host managed_node3 7557 1726882100.42410: ^ task is: TASK: Include the task 'get_current_interfaces.yml' 7557 1726882100.42413: ^ state is: HOST STATE: block=2, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7557 1726882100.42416: getting variables 7557 1726882100.42417: in VariableManager get_vars() 7557 1726882100.42436: Calling all_inventory to load vars for managed_node3 7557 1726882100.42438: Calling groups_inventory to load vars for managed_node3 7557 1726882100.42440: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882100.42446: Calling all_plugins_play to load vars for managed_node3 7557 1726882100.42448: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882100.42451: Calling groups_plugins_play to load vars for managed_node3 7557 1726882100.43743: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882100.45271: done with get_vars() 7557 1726882100.45299: done getting variables TASK [Include the task 'get_current_interfaces.yml'] *************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:3 Friday 20 September 2024 21:28:20 -0400 (0:00:00.091) 0:00:26.306 ****** 7557 1726882100.45376: entering _queue_task() for managed_node3/include_tasks 7557 1726882100.45713: worker is 1 (out of 1 available) 7557 1726882100.45724: exiting _queue_task() for managed_node3/include_tasks 7557 1726882100.45735: done queuing things up, now waiting for results queue to drain 7557 1726882100.45736: waiting for pending results... 7557 1726882100.46116: running TaskExecutor() for managed_node3/TASK: Include the task 'get_current_interfaces.yml' 7557 1726882100.46137: in run() - task 12673a56-9f93-ed48-b3a5-000000001030 7557 1726882100.46160: variable 'ansible_search_path' from source: unknown 7557 1726882100.46168: variable 'ansible_search_path' from source: unknown 7557 1726882100.46219: calling self._execute() 7557 1726882100.46336: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882100.46347: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882100.46362: variable 'omit' from source: magic vars 7557 1726882100.46756: variable 'ansible_distribution_major_version' from source: facts 7557 1726882100.46771: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882100.46781: _execute() done 7557 1726882100.46789: dumping result to json 7557 1726882100.46799: done dumping result, returning 7557 1726882100.46811: done running TaskExecutor() for managed_node3/TASK: Include the task 'get_current_interfaces.yml' [12673a56-9f93-ed48-b3a5-000000001030] 7557 1726882100.46825: sending task result for task 12673a56-9f93-ed48-b3a5-000000001030 7557 1726882100.47024: no more pending results, returning what we have 7557 1726882100.47029: in VariableManager get_vars() 7557 1726882100.47090: Calling all_inventory to load vars for managed_node3 7557 1726882100.47097: Calling groups_inventory to load vars for managed_node3 7557 1726882100.47100: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882100.47115: Calling all_plugins_play to load vars for managed_node3 7557 1726882100.47124: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882100.47128: Calling groups_plugins_play to load vars for managed_node3 7557 1726882100.47717: done sending task result for task 12673a56-9f93-ed48-b3a5-000000001030 7557 1726882100.47721: WORKER PROCESS EXITING 7557 1726882100.48642: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 
1726882100.50954: done with get_vars() 7557 1726882100.50977: variable 'ansible_search_path' from source: unknown 7557 1726882100.50979: variable 'ansible_search_path' from source: unknown 7557 1726882100.51044: we have included files to process 7557 1726882100.51046: generating all_blocks data 7557 1726882100.51047: done generating all_blocks data 7557 1726882100.51048: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 7557 1726882100.51049: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 7557 1726882100.51051: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 7557 1726882100.51314: done processing included file 7557 1726882100.51316: iterating over new_blocks loaded from include file 7557 1726882100.51318: in VariableManager get_vars() 7557 1726882100.51343: done with get_vars() 7557 1726882100.51345: filtering new block on tags 7557 1726882100.51362: done filtering new block on tags 7557 1726882100.51364: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml for managed_node3 7557 1726882100.51369: extending task lists for all hosts with included blocks 7557 1726882100.51523: done extending task lists 7557 1726882100.51525: done processing included files 7557 1726882100.51526: results queue empty 7557 1726882100.51526: checking for any_errors_fatal 7557 1726882100.51529: done checking for any_errors_fatal 7557 1726882100.51530: checking for max_fail_percentage 7557 1726882100.51531: done checking for max_fail_percentage 7557 1726882100.51532: checking to see if all hosts have failed and the running result is not ok 7557 1726882100.51533: done checking to see if all hosts have failed 7557 1726882100.51533: getting the remaining hosts for this loop 7557 1726882100.51534: done getting the remaining hosts for this loop 7557 1726882100.51537: getting the next task for host managed_node3 7557 1726882100.51541: done getting next task for host managed_node3 7557 1726882100.51543: ^ task is: TASK: Gather current interface info 7557 1726882100.51546: ^ state is: HOST STATE: block=2, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7557 1726882100.51548: getting variables 7557 1726882100.51549: in VariableManager get_vars() 7557 1726882100.51565: Calling all_inventory to load vars for managed_node3 7557 1726882100.51567: Calling groups_inventory to load vars for managed_node3 7557 1726882100.51569: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882100.51574: Calling all_plugins_play to load vars for managed_node3 7557 1726882100.51576: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882100.51579: Calling groups_plugins_play to load vars for managed_node3 7557 1726882100.52768: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882100.54829: done with get_vars() 7557 1726882100.54852: done getting variables 7557 1726882100.54897: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gather current interface info] ******************************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3 Friday 20 September 2024 21:28:20 -0400 (0:00:00.095) 0:00:26.402 ****** 7557 1726882100.54930: entering _queue_task() for managed_node3/command 7557 1726882100.55247: worker is 1 (out of 1 available) 7557 1726882100.55261: exiting _queue_task() for managed_node3/command 7557 1726882100.55275: done queuing things up, now waiting for results queue to drain 7557 1726882100.55277: waiting for pending results... 7557 1726882100.55523: running TaskExecutor() for managed_node3/TASK: Gather current interface info 7557 1726882100.55652: in run() - task 12673a56-9f93-ed48-b3a5-000000001067 7557 1726882100.55672: variable 'ansible_search_path' from source: unknown 7557 1726882100.55684: variable 'ansible_search_path' from source: unknown 7557 1726882100.55726: calling self._execute() 7557 1726882100.55830: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882100.55841: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882100.55854: variable 'omit' from source: magic vars 7557 1726882100.56227: variable 'ansible_distribution_major_version' from source: facts 7557 1726882100.56244: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882100.56253: variable 'omit' from source: magic vars 7557 1726882100.56343: variable 'omit' from source: magic vars 7557 1726882100.56434: variable 'omit' from source: magic vars 7557 1726882100.56755: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7557 1726882100.56956: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7557 1726882100.56959: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7557 1726882100.56962: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7557 1726882100.56964: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7557 1726882100.56966: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7557 
1726882100.56968: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882100.56971: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882100.57076: Set connection var ansible_module_compression to ZIP_DEFLATED 7557 1726882100.57095: Set connection var ansible_shell_executable to /bin/sh 7557 1726882100.57105: Set connection var ansible_shell_type to sh 7557 1726882100.57115: Set connection var ansible_pipelining to False 7557 1726882100.57122: Set connection var ansible_connection to ssh 7557 1726882100.57130: Set connection var ansible_timeout to 10 7557 1726882100.57155: variable 'ansible_shell_executable' from source: unknown 7557 1726882100.57163: variable 'ansible_connection' from source: unknown 7557 1726882100.57170: variable 'ansible_module_compression' from source: unknown 7557 1726882100.57175: variable 'ansible_shell_type' from source: unknown 7557 1726882100.57182: variable 'ansible_shell_executable' from source: unknown 7557 1726882100.57194: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882100.57204: variable 'ansible_pipelining' from source: unknown 7557 1726882100.57211: variable 'ansible_timeout' from source: unknown 7557 1726882100.57220: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882100.57421: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 7557 1726882100.57438: variable 'omit' from source: magic vars 7557 1726882100.57447: starting attempt loop 7557 1726882100.57599: running the handler 7557 1726882100.57602: _low_level_execute_command(): starting 7557 1726882100.57604: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7557 1726882100.58651: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7557 1726882100.58665: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7557 1726882100.58730: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882100.58791: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882100.58816: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882100.58846: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882100.58956: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882100.60815: stdout chunk (state=3): >>>/root <<< 7557 
1726882100.60882: stdout chunk (state=3): >>><<< 7557 1726882100.60909: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882100.60919: stderr chunk (state=3): >>><<< 7557 1726882100.60957: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7557 1726882100.61059: _low_level_execute_command(): starting 7557 1726882100.61063: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882100.609706-8598-950057662414 `" && echo ansible-tmp-1726882100.609706-8598-950057662414="` echo /root/.ansible/tmp/ansible-tmp-1726882100.609706-8598-950057662414 `" ) && sleep 0' 7557 1726882100.61938: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7557 1726882100.61953: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found <<< 7557 1726882100.62038: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882100.62075: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882100.62209: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882100.62212: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882100.62290: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882100.64155: stdout chunk (state=3): >>>ansible-tmp-1726882100.609706-8598-950057662414=/root/.ansible/tmp/ansible-tmp-1726882100.609706-8598-950057662414 
<<< 7557 1726882100.64279: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882100.64312: stderr chunk (state=3): >>><<< 7557 1726882100.64415: stdout chunk (state=3): >>><<< 7557 1726882100.64601: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882100.609706-8598-950057662414=/root/.ansible/tmp/ansible-tmp-1726882100.609706-8598-950057662414 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7557 1726882100.64604: variable 'ansible_module_compression' from source: unknown 7557 1726882100.64634: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-7557ap94rh2e/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 7557 1726882100.64682: variable 'ansible_facts' from source: unknown 7557 1726882100.64905: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882100.609706-8598-950057662414/AnsiballZ_command.py 7557 1726882100.65178: Sending initial data 7557 1726882100.65188: Sent initial data (150 bytes) 7557 1726882100.65855: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7557 1726882100.65877: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882100.65890: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882100.65942: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882100.65964: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882100.66044: stderr chunk (state=3): >>>debug1: mux_client_request_session: master 
session id: 2 <<< 7557 1726882100.67590: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 7557 1726882100.67691: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 7557 1726882100.67738: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-7557ap94rh2e/tmpxl41as12 /root/.ansible/tmp/ansible-tmp-1726882100.609706-8598-950057662414/AnsiballZ_command.py <<< 7557 1726882100.67748: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882100.609706-8598-950057662414/AnsiballZ_command.py" <<< 7557 1726882100.67904: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-7557ap94rh2e/tmpxl41as12" to remote "/root/.ansible/tmp/ansible-tmp-1726882100.609706-8598-950057662414/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882100.609706-8598-950057662414/AnsiballZ_command.py" <<< 7557 1726882100.69184: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882100.69257: stderr chunk (state=3): >>><<< 7557 1726882100.69267: stdout chunk (state=3): >>><<< 7557 1726882100.69504: done transferring module to remote 7557 1726882100.69508: _low_level_execute_command(): starting 7557 1726882100.69511: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882100.609706-8598-950057662414/ /root/.ansible/tmp/ansible-tmp-1726882100.609706-8598-950057662414/AnsiballZ_command.py && sleep 0' 7557 1726882100.70615: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7557 1726882100.70711: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882100.70923: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882100.70940: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 7557 1726882100.71001: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882100.72923: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882100.72926: stdout chunk (state=3): >>><<< 7557 1726882100.72928: stderr chunk (state=3): >>><<< 7557 1726882100.72947: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7557 1726882100.72954: _low_level_execute_command(): starting 7557 1726882100.72962: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882100.609706-8598-950057662414/AnsiballZ_command.py && sleep 0' 7557 1726882100.74011: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7557 1726882100.74112: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7557 1726882100.74269: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882100.74280: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882100.74310: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882100.74326: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882100.74374: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882100.89489: stdout chunk (state=3): >>> {"changed": true, "stdout": "eth0\nlo\npeerveth0\nveth0", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 
21:28:20.890150", "end": "2024-09-20 21:28:20.893160", "delta": "0:00:00.003010", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 7557 1726882100.91213: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. <<< 7557 1726882100.91217: stdout chunk (state=3): >>><<< 7557 1726882100.91220: stderr chunk (state=3): >>><<< 7557 1726882100.91224: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "eth0\nlo\npeerveth0\nveth0", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 21:28:20.890150", "end": "2024-09-20 21:28:20.893160", "delta": "0:00:00.003010", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. 
7557 1726882100.91256: done with _execute_module (ansible.legacy.command, {'chdir': '/sys/class/net', '_raw_params': 'ls -1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882100.609706-8598-950057662414/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7557 1726882100.91274: _low_level_execute_command(): starting 7557 1726882100.91286: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882100.609706-8598-950057662414/ > /dev/null 2>&1 && sleep 0' 7557 1726882100.92625: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7557 1726882100.92631: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882100.92634: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 7557 1726882100.92636: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found <<< 7557 1726882100.92638: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882100.92873: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882100.92876: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882100.94762: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882100.94800: stderr chunk (state=3): >>><<< 7557 1726882100.94805: stdout chunk (state=3): >>><<< 7557 1726882100.95014: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7557 1726882100.95017: handler run complete 7557 1726882100.95020: Evaluated conditional (False): False 7557 1726882100.95022: attempt loop complete, returning result 7557 1726882100.95024: _execute() done 7557 1726882100.95027: dumping result to json 7557 1726882100.95029: done dumping result, returning 7557 1726882100.95031: done running TaskExecutor() for managed_node3/TASK: Gather current interface info [12673a56-9f93-ed48-b3a5-000000001067] 7557 1726882100.95033: sending task result for task 12673a56-9f93-ed48-b3a5-000000001067 7557 1726882100.95264: done sending task result for task 12673a56-9f93-ed48-b3a5-000000001067 7557 1726882100.95268: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "cmd": [ "ls", "-1" ], "delta": "0:00:00.003010", "end": "2024-09-20 21:28:20.893160", "rc": 0, "start": "2024-09-20 21:28:20.890150" } STDOUT: eth0 lo peerveth0 veth0 7557 1726882100.95360: no more pending results, returning what we have 7557 1726882100.95364: results queue empty 7557 1726882100.95366: checking for any_errors_fatal 7557 1726882100.95367: done checking for any_errors_fatal 7557 1726882100.95368: checking for max_fail_percentage 7557 1726882100.95370: done checking for max_fail_percentage 7557 1726882100.95371: checking to see if all hosts have failed and the running result is not ok 7557 1726882100.95372: done checking to see if all hosts have failed 7557 1726882100.95373: getting the remaining hosts for this loop 7557 1726882100.95374: done getting the remaining hosts for this loop 7557 1726882100.95378: getting the next task for host managed_node3 7557 1726882100.95601: done getting next task for host managed_node3 7557 1726882100.95604: ^ task is: TASK: Set current_interfaces 7557 1726882100.95610: ^ state is: HOST STATE: block=2, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7557 1726882100.95614: getting variables 7557 1726882100.95616: in VariableManager get_vars() 7557 1726882100.95667: Calling all_inventory to load vars for managed_node3 7557 1726882100.95669: Calling groups_inventory to load vars for managed_node3 7557 1726882100.95671: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882100.95682: Calling all_plugins_play to load vars for managed_node3 7557 1726882100.95685: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882100.95688: Calling groups_plugins_play to load vars for managed_node3 7557 1726882100.99240: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882101.03330: done with get_vars() 7557 1726882101.03364: done getting variables 7557 1726882101.03650: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set current_interfaces] ************************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:9 Friday 20 September 2024 21:28:21 -0400 (0:00:00.487) 0:00:26.889 ****** 7557 1726882101.03684: entering _queue_task() for managed_node3/set_fact 7557 1726882101.04349: worker is 1 (out of 1 available) 7557 1726882101.04361: exiting _queue_task() for managed_node3/set_fact 7557 1726882101.04376: done queuing things up, now waiting for results queue to drain 7557 1726882101.04377: waiting for pending results... 7557 1726882101.04914: running TaskExecutor() for managed_node3/TASK: Set current_interfaces 7557 1726882101.04919: in run() - task 12673a56-9f93-ed48-b3a5-000000001068 7557 1726882101.05115: variable 'ansible_search_path' from source: unknown 7557 1726882101.05122: variable 'ansible_search_path' from source: unknown 7557 1726882101.05149: calling self._execute() 7557 1726882101.05245: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882101.05251: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882101.05300: variable 'omit' from source: magic vars 7557 1726882101.06021: variable 'ansible_distribution_major_version' from source: facts 7557 1726882101.06034: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882101.06040: variable 'omit' from source: magic vars 7557 1726882101.06088: variable 'omit' from source: magic vars 7557 1726882101.06398: variable '_current_interfaces' from source: set_fact 7557 1726882101.06499: variable 'omit' from source: magic vars 7557 1726882101.06502: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7557 1726882101.06732: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7557 1726882101.06751: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7557 1726882101.06767: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7557 1726882101.06781: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7557 1726882101.06813: variable 
'inventory_hostname' from source: host vars for 'managed_node3' 7557 1726882101.06816: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882101.06818: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882101.07125: Set connection var ansible_module_compression to ZIP_DEFLATED 7557 1726882101.07132: Set connection var ansible_shell_executable to /bin/sh 7557 1726882101.07135: Set connection var ansible_shell_type to sh 7557 1726882101.07140: Set connection var ansible_pipelining to False 7557 1726882101.07143: Set connection var ansible_connection to ssh 7557 1726882101.07188: Set connection var ansible_timeout to 10 7557 1726882101.07191: variable 'ansible_shell_executable' from source: unknown 7557 1726882101.07199: variable 'ansible_connection' from source: unknown 7557 1726882101.07201: variable 'ansible_module_compression' from source: unknown 7557 1726882101.07203: variable 'ansible_shell_type' from source: unknown 7557 1726882101.07205: variable 'ansible_shell_executable' from source: unknown 7557 1726882101.07207: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882101.07209: variable 'ansible_pipelining' from source: unknown 7557 1726882101.07211: variable 'ansible_timeout' from source: unknown 7557 1726882101.07213: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882101.07648: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 7557 1726882101.07651: variable 'omit' from source: magic vars 7557 1726882101.07652: starting attempt loop 7557 1726882101.07654: running the handler 7557 1726882101.07656: handler run complete 7557 1726882101.07658: attempt loop complete, returning result 7557 1726882101.07659: _execute() done 7557 1726882101.07661: dumping result to json 7557 1726882101.07662: done dumping result, returning 7557 1726882101.07664: done running TaskExecutor() for managed_node3/TASK: Set current_interfaces [12673a56-9f93-ed48-b3a5-000000001068] 7557 1726882101.07665: sending task result for task 12673a56-9f93-ed48-b3a5-000000001068 7557 1726882101.07730: done sending task result for task 12673a56-9f93-ed48-b3a5-000000001068 7557 1726882101.07733: WORKER PROCESS EXITING ok: [managed_node3] => { "ansible_facts": { "current_interfaces": [ "eth0", "lo", "peerveth0", "veth0" ] }, "changed": false } 7557 1726882101.07811: no more pending results, returning what we have 7557 1726882101.07815: results queue empty 7557 1726882101.07816: checking for any_errors_fatal 7557 1726882101.07829: done checking for any_errors_fatal 7557 1726882101.07830: checking for max_fail_percentage 7557 1726882101.07831: done checking for max_fail_percentage 7557 1726882101.07832: checking to see if all hosts have failed and the running result is not ok 7557 1726882101.07833: done checking to see if all hosts have failed 7557 1726882101.07834: getting the remaining hosts for this loop 7557 1726882101.07835: done getting the remaining hosts for this loop 7557 1726882101.07839: getting the next task for host managed_node3 7557 1726882101.07847: done getting next task for host managed_node3 7557 1726882101.07850: ^ task is: TASK: Show current_interfaces 7557 1726882101.07854: ^ state is: HOST STATE: block=2, task=22, rescue=0, 
always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7557 1726882101.07858: getting variables 7557 1726882101.07860: in VariableManager get_vars() 7557 1726882101.07914: Calling all_inventory to load vars for managed_node3 7557 1726882101.07917: Calling groups_inventory to load vars for managed_node3 7557 1726882101.07919: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882101.07929: Calling all_plugins_play to load vars for managed_node3 7557 1726882101.07932: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882101.07934: Calling groups_plugins_play to load vars for managed_node3 7557 1726882101.10807: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882101.14199: done with get_vars() 7557 1726882101.14232: done getting variables 7557 1726882101.14299: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Show current_interfaces] ************************************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:5 Friday 20 September 2024 21:28:21 -0400 (0:00:00.107) 0:00:26.997 ****** 7557 1726882101.14447: entering _queue_task() for managed_node3/debug 7557 1726882101.15226: worker is 1 (out of 1 available) 7557 1726882101.15239: exiting _queue_task() for managed_node3/debug 7557 1726882101.15251: done queuing things up, now waiting for results queue to drain 7557 1726882101.15253: waiting for pending results... 
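The set_fact result above and the debug output below suggest tasks of roughly this shape at get_current_interfaces.yml:9 and show_interfaces.yml:5; the stdout_lines expression is an assumption, since the log only shows the resulting list:

- name: Set current_interfaces
  set_fact:
    current_interfaces: "{{ _current_interfaces.stdout_lines }}"  # expression assumed; the log shows only the final list

- name: Show current_interfaces
  debug:
    msg: "current_interfaces: {{ current_interfaces }}"

With the fact set, the debug task simply echoes the same list (['eth0', 'lo', 'peerveth0', 'veth0']) before the play moves on to the "Install iproute" task.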
7557 1726882101.15628: running TaskExecutor() for managed_node3/TASK: Show current_interfaces 7557 1726882101.16315: in run() - task 12673a56-9f93-ed48-b3a5-000000001031 7557 1726882101.16319: variable 'ansible_search_path' from source: unknown 7557 1726882101.16323: variable 'ansible_search_path' from source: unknown 7557 1726882101.16326: calling self._execute() 7557 1726882101.16328: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882101.16330: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882101.16333: variable 'omit' from source: magic vars 7557 1726882101.17034: variable 'ansible_distribution_major_version' from source: facts 7557 1726882101.17074: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882101.17179: variable 'omit' from source: magic vars 7557 1726882101.17230: variable 'omit' from source: magic vars 7557 1726882101.17445: variable 'current_interfaces' from source: set_fact 7557 1726882101.17480: variable 'omit' from source: magic vars 7557 1726882101.17639: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7557 1726882101.17679: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7557 1726882101.17710: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7557 1726882101.17732: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7557 1726882101.17805: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7557 1726882101.17845: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7557 1726882101.17889: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882101.17919: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882101.18413: Set connection var ansible_module_compression to ZIP_DEFLATED 7557 1726882101.18421: Set connection var ansible_shell_executable to /bin/sh 7557 1726882101.18424: Set connection var ansible_shell_type to sh 7557 1726882101.18430: Set connection var ansible_pipelining to False 7557 1726882101.18433: Set connection var ansible_connection to ssh 7557 1726882101.18435: Set connection var ansible_timeout to 10 7557 1726882101.18461: variable 'ansible_shell_executable' from source: unknown 7557 1726882101.18464: variable 'ansible_connection' from source: unknown 7557 1726882101.18467: variable 'ansible_module_compression' from source: unknown 7557 1726882101.18469: variable 'ansible_shell_type' from source: unknown 7557 1726882101.18471: variable 'ansible_shell_executable' from source: unknown 7557 1726882101.18473: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882101.18475: variable 'ansible_pipelining' from source: unknown 7557 1726882101.18480: variable 'ansible_timeout' from source: unknown 7557 1726882101.18484: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882101.19017: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 7557 1726882101.19026: variable 'omit' 
from source: magic vars 7557 1726882101.19031: starting attempt loop 7557 1726882101.19034: running the handler 7557 1726882101.19079: handler run complete 7557 1726882101.19096: attempt loop complete, returning result 7557 1726882101.19099: _execute() done 7557 1726882101.19499: dumping result to json 7557 1726882101.19502: done dumping result, returning 7557 1726882101.19505: done running TaskExecutor() for managed_node3/TASK: Show current_interfaces [12673a56-9f93-ed48-b3a5-000000001031] 7557 1726882101.19507: sending task result for task 12673a56-9f93-ed48-b3a5-000000001031 7557 1726882101.19572: done sending task result for task 12673a56-9f93-ed48-b3a5-000000001031 7557 1726882101.19575: WORKER PROCESS EXITING ok: [managed_node3] => {} MSG: current_interfaces: ['eth0', 'lo', 'peerveth0', 'veth0'] 7557 1726882101.19627: no more pending results, returning what we have 7557 1726882101.19631: results queue empty 7557 1726882101.19632: checking for any_errors_fatal 7557 1726882101.19638: done checking for any_errors_fatal 7557 1726882101.19638: checking for max_fail_percentage 7557 1726882101.19640: done checking for max_fail_percentage 7557 1726882101.19641: checking to see if all hosts have failed and the running result is not ok 7557 1726882101.19642: done checking to see if all hosts have failed 7557 1726882101.19642: getting the remaining hosts for this loop 7557 1726882101.19645: done getting the remaining hosts for this loop 7557 1726882101.19648: getting the next task for host managed_node3 7557 1726882101.19656: done getting next task for host managed_node3 7557 1726882101.19659: ^ task is: TASK: Install iproute 7557 1726882101.19662: ^ state is: HOST STATE: block=2, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7557 1726882101.19667: getting variables 7557 1726882101.19668: in VariableManager get_vars() 7557 1726882101.19724: Calling all_inventory to load vars for managed_node3 7557 1726882101.19727: Calling groups_inventory to load vars for managed_node3 7557 1726882101.19730: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882101.19740: Calling all_plugins_play to load vars for managed_node3 7557 1726882101.19744: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882101.19746: Calling groups_plugins_play to load vars for managed_node3 7557 1726882101.22442: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882101.24647: done with get_vars() 7557 1726882101.24680: done getting variables 7557 1726882101.25031: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Install iproute] ********************************************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:16 Friday 20 September 2024 21:28:21 -0400 (0:00:00.106) 0:00:27.103 ****** 7557 1726882101.25065: entering _queue_task() for managed_node3/package 7557 1726882101.25803: worker is 1 (out of 1 available) 7557 1726882101.25814: exiting _queue_task() for managed_node3/package 7557 1726882101.25827: done queuing things up, now waiting for results queue to drain 7557 1726882101.25828: waiting for pending results... 7557 1726882101.26257: running TaskExecutor() for managed_node3/TASK: Install iproute 7557 1726882101.26603: in run() - task 12673a56-9f93-ed48-b3a5-000000000e02 7557 1726882101.26607: variable 'ansible_search_path' from source: unknown 7557 1726882101.26610: variable 'ansible_search_path' from source: unknown 7557 1726882101.26613: calling self._execute() 7557 1726882101.26768: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882101.26830: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882101.26912: variable 'omit' from source: magic vars 7557 1726882101.27688: variable 'ansible_distribution_major_version' from source: facts 7557 1726882101.27691: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882101.27696: variable 'omit' from source: magic vars 7557 1726882101.27727: variable 'omit' from source: magic vars 7557 1726882101.28085: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 7557 1726882101.32501: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 7557 1726882101.32870: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 7557 1726882101.32916: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 7557 1726882101.32955: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 7557 1726882101.32986: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 7557 1726882101.33239: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7557 1726882101.33287: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7557 1726882101.33319: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7557 1726882101.33362: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7557 1726882101.33380: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7557 1726882101.33480: variable '__network_is_ostree' from source: set_fact 7557 1726882101.33490: variable 'omit' from source: magic vars 7557 1726882101.33527: variable 'omit' from source: magic vars 7557 1726882101.33561: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7557 1726882101.33597: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7557 1726882101.33620: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7557 1726882101.33642: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7557 1726882101.33655: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7557 1726882101.33688: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7557 1726882101.33697: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882101.33705: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882101.33809: Set connection var ansible_module_compression to ZIP_DEFLATED 7557 1726882101.33821: Set connection var ansible_shell_executable to /bin/sh 7557 1726882101.33828: Set connection var ansible_shell_type to sh 7557 1726882101.33839: Set connection var ansible_pipelining to False 7557 1726882101.33845: Set connection var ansible_connection to ssh 7557 1726882101.33857: Set connection var ansible_timeout to 10 7557 1726882101.33882: variable 'ansible_shell_executable' from source: unknown 7557 1726882101.33889: variable 'ansible_connection' from source: unknown 7557 1726882101.33898: variable 'ansible_module_compression' from source: unknown 7557 1726882101.33904: variable 'ansible_shell_type' from source: unknown 7557 1726882101.33911: variable 'ansible_shell_executable' from source: unknown 7557 1726882101.33916: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882101.33923: variable 'ansible_pipelining' from source: unknown 7557 1726882101.33929: variable 'ansible_timeout' from source: unknown 7557 1726882101.33935: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882101.34036: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 7557 1726882101.34052: variable 'omit' from source: magic vars 7557 1726882101.34061: starting attempt loop 7557 1726882101.34069: running the handler 7557 1726882101.34079: variable 'ansible_facts' from source: unknown 7557 1726882101.34086: variable 'ansible_facts' from source: unknown 7557 1726882101.34126: _low_level_execute_command(): starting 7557 1726882101.34248: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7557 1726882101.34801: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7557 1726882101.34817: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7557 1726882101.34831: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882101.34912: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882101.34949: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882101.34975: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882101.35001: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882101.35223: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882101.36903: stdout chunk (state=3): >>>/root <<< 7557 1726882101.37052: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882101.37056: stdout chunk (state=3): >>><<< 7557 1726882101.37059: stderr chunk (state=3): >>><<< 7557 1726882101.37077: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK 
debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7557 1726882101.37107: _low_level_execute_command(): starting 7557 1726882101.37118: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882101.3709164-8630-266530649886560 `" && echo ansible-tmp-1726882101.3709164-8630-266530649886560="` echo /root/.ansible/tmp/ansible-tmp-1726882101.3709164-8630-266530649886560 `" ) && sleep 0' 7557 1726882101.38506: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882101.38548: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882101.38600: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882101.38604: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882101.38729: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882101.40592: stdout chunk (state=3): >>>ansible-tmp-1726882101.3709164-8630-266530649886560=/root/.ansible/tmp/ansible-tmp-1726882101.3709164-8630-266530649886560 <<< 7557 1726882101.40768: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882101.40771: stdout chunk (state=3): >>><<< 7557 1726882101.40774: stderr chunk (state=3): >>><<< 7557 1726882101.40776: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882101.3709164-8630-266530649886560=/root/.ansible/tmp/ansible-tmp-1726882101.3709164-8630-266530649886560 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7557 1726882101.40909: variable 'ansible_module_compression' from source: unknown 7557 1726882101.40963: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-7557ap94rh2e/ansiballz_cache/ansible.modules.dnf-ZIP_DEFLATED 7557 1726882101.41204: variable 'ansible_facts' from source: unknown 7557 1726882101.41338: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882101.3709164-8630-266530649886560/AnsiballZ_dnf.py 7557 1726882101.41770: Sending initial data 7557 1726882101.41773: Sent initial data (150 bytes) 7557 1726882101.43118: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7557 1726882101.43128: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7557 1726882101.43292: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882101.43364: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882101.43561: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882101.45067: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 7557 1726882101.45075: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 <<< 7557 1726882101.45087: stderr chunk (state=3): >>>debug2: Server supports extension "statvfs@openssh.com" revision 2 <<< 7557 1726882101.45090: stderr chunk (state=3): >>>debug2: Server supports extension "fstatvfs@openssh.com" revision 2 <<< 7557 1726882101.45094: stderr chunk (state=3): >>>debug2: Server supports extension "hardlink@openssh.com" revision 1 <<< 7557 1726882101.45127: stderr chunk (state=3): >>>debug2: Server supports extension "fsync@openssh.com" revision 1 <<< 7557 1726882101.45130: stderr chunk (state=3): >>>debug2: Server supports extension "lsetstat@openssh.com" revision 1 <<< 7557 1726882101.45207: stderr chunk (state=3): >>>debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 7557 1726882101.45210: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 7557 1726882101.45315: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-7557ap94rh2e/tmpbvsyd04p /root/.ansible/tmp/ansible-tmp-1726882101.3709164-8630-266530649886560/AnsiballZ_dnf.py <<< 7557 1726882101.45318: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882101.3709164-8630-266530649886560/AnsiballZ_dnf.py" <<< 7557 1726882101.45437: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-7557ap94rh2e/tmpbvsyd04p" to remote "/root/.ansible/tmp/ansible-tmp-1726882101.3709164-8630-266530649886560/AnsiballZ_dnf.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882101.3709164-8630-266530649886560/AnsiballZ_dnf.py" <<< 7557 1726882101.46982: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882101.46985: stderr chunk (state=3): >>><<< 7557 1726882101.46988: stdout chunk (state=3): >>><<< 7557 1726882101.47220: done transferring module to remote 7557 1726882101.47230: _low_level_execute_command(): starting 7557 1726882101.47235: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882101.3709164-8630-266530649886560/ /root/.ansible/tmp/ansible-tmp-1726882101.3709164-8630-266530649886560/AnsiballZ_dnf.py && sleep 0' 7557 1726882101.48596: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7557 1726882101.48600: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found <<< 7557 1726882101.48604: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882101.48606: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882101.48609: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882101.48673: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882101.48677: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882101.48835: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882101.48887: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882101.50805: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882101.50854: stderr chunk (state=3): >>><<< 7557 1726882101.50857: stdout chunk (state=3): >>><<< 7557 1726882101.50875: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7557 1726882101.50878: _low_level_execute_command(): starting 7557 1726882101.50884: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882101.3709164-8630-266530649886560/AnsiballZ_dnf.py && sleep 0' 7557 1726882101.51460: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7557 1726882101.51470: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7557 1726882101.51480: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882101.51670: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882101.51674: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882101.51716: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882101.92052: stdout chunk (state=3): >>> {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["iproute"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} <<< 7557 1726882101.96158: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection 
to 10.31.10.229 closed. <<< 7557 1726882101.96163: stdout chunk (state=3): >>><<< 7557 1726882101.96170: stderr chunk (state=3): >>><<< 7557 1726882101.96185: _low_level_execute_command() done: rc=0, stdout= {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["iproute"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. 
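
Based on the dnf module arguments in the result above (name: iproute, state: present, invoked through the generic package action loaded earlier) and the __install_status / "attempts" bookkeeping that follows, the "Install iproute" task at manage_test_interface.yml:16 is plausibly shaped like the sketch below; the retries and delay values are assumptions, not taken from the log:

    # Sketch inferred from the dnf module_args and the __install_status /
    # "attempts" fields in this log; retries and delay are assumed values.
    - name: Install iproute
      ansible.builtin.package:
        name: iproute
        state: present
      register: __install_status
      until: __install_status is success
      retries: 3   # assumed; only "attempts": 1 is visible in the result
      delay: 5     # assumed
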
7557 1726882101.96221: done with _execute_module (ansible.legacy.dnf, {'name': 'iproute', 'state': 'present', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.dnf', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882101.3709164-8630-266530649886560/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7557 1726882101.96229: _low_level_execute_command(): starting 7557 1726882101.96231: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882101.3709164-8630-266530649886560/ > /dev/null 2>&1 && sleep 0' 7557 1726882101.96655: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882101.96658: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882101.96661: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration <<< 7557 1726882101.96663: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7557 1726882101.96665: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882101.96722: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882101.96725: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882101.96766: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882101.98695: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882101.98698: stdout chunk (state=3): >>><<< 7557 1726882101.98700: stderr chunk (state=3): >>><<< 7557 1726882101.98703: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7557 1726882101.98705: handler run complete 7557 1726882101.98782: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 7557 1726882101.98981: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 7557 1726882101.99031: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 7557 1726882101.99062: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 7557 1726882101.99147: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 7557 1726882101.99181: variable '__install_status' from source: set_fact 7557 1726882101.99203: Evaluated conditional (__install_status is success): True 7557 1726882101.99220: attempt loop complete, returning result 7557 1726882101.99223: _execute() done 7557 1726882101.99226: dumping result to json 7557 1726882101.99229: done dumping result, returning 7557 1726882101.99245: done running TaskExecutor() for managed_node3/TASK: Install iproute [12673a56-9f93-ed48-b3a5-000000000e02] 7557 1726882101.99299: sending task result for task 12673a56-9f93-ed48-b3a5-000000000e02 7557 1726882101.99467: done sending task result for task 12673a56-9f93-ed48-b3a5-000000000e02 7557 1726882101.99471: WORKER PROCESS EXITING ok: [managed_node3] => { "attempts": 1, "changed": false, "rc": 0, "results": [] } MSG: Nothing to do 7557 1726882101.99582: no more pending results, returning what we have 7557 1726882101.99586: results queue empty 7557 1726882101.99587: checking for any_errors_fatal 7557 1726882101.99592: done checking for any_errors_fatal 7557 1726882101.99601: checking for max_fail_percentage 7557 1726882101.99603: done checking for max_fail_percentage 7557 1726882101.99604: checking to see if all hosts have failed and the running result is not ok 7557 1726882101.99605: done checking to see if all hosts have failed 7557 1726882101.99605: getting the remaining hosts for this loop 7557 1726882101.99607: done getting the remaining hosts for this loop 7557 1726882101.99610: getting the next task for host managed_node3 7557 1726882101.99615: done getting next task for host managed_node3 7557 1726882101.99617: ^ task is: TASK: Create veth interface {{ interface }} 7557 1726882101.99621: ^ state is: HOST STATE: block=2, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7557 1726882101.99624: getting variables 7557 1726882101.99625: in VariableManager get_vars() 7557 1726882101.99680: Calling all_inventory to load vars for managed_node3 7557 1726882101.99683: Calling groups_inventory to load vars for managed_node3 7557 1726882101.99685: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882101.99698: Calling all_plugins_play to load vars for managed_node3 7557 1726882101.99701: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882101.99704: Calling groups_plugins_play to load vars for managed_node3 7557 1726882102.00474: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882102.01423: done with get_vars() 7557 1726882102.01447: done getting variables 7557 1726882102.01513: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 7557 1726882102.01635: variable 'interface' from source: play vars TASK [Create veth interface veth0] ********************************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:27 Friday 20 September 2024 21:28:22 -0400 (0:00:00.765) 0:00:27.869 ****** 7557 1726882102.01665: entering _queue_task() for managed_node3/command 7557 1726882102.01967: worker is 1 (out of 1 available) 7557 1726882102.01981: exiting _queue_task() for managed_node3/command 7557 1726882102.01998: done queuing things up, now waiting for results queue to drain 7557 1726882102.02000: waiting for pending results... 
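
The skipped-item records just below show this task's loop items and guard verbatim; a plausible reconstruction of the "Create veth interface {{ interface }}" task at manage_test_interface.yml:27, with the interface name re-parameterized the way the templated task name implies, is sketched here (an inference, not the verbatim source):

    # Inferred from the rendered loop items and false_condition below;
    # "veth0" / "peerveth0" are assumed to come from {{ interface }} templating.
    - name: Create veth interface {{ interface }}
      ansible.builtin.command: "{{ item }}"
      loop:
        - ip link add {{ interface }} type veth peer name peer{{ interface }}
        - ip link set peer{{ interface }} up
        - ip link set {{ interface }} up
      when: type == 'veth' and state == 'present' and interface not in current_interfaces

Because state is 'absent' in this run and veth0 is already present, every item is skipped, exactly as the per-item results below record.
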
7557 1726882102.02295: running TaskExecutor() for managed_node3/TASK: Create veth interface veth0 7557 1726882102.02382: in run() - task 12673a56-9f93-ed48-b3a5-000000000e03 7557 1726882102.02390: variable 'ansible_search_path' from source: unknown 7557 1726882102.02401: variable 'ansible_search_path' from source: unknown 7557 1726882102.02608: variable 'interface' from source: play vars 7557 1726882102.02670: variable 'interface' from source: play vars 7557 1726882102.02725: variable 'interface' from source: play vars 7557 1726882102.02835: Loaded config def from plugin (lookup/items) 7557 1726882102.02840: Loading LookupModule 'items' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/items.py 7557 1726882102.02857: variable 'omit' from source: magic vars 7557 1726882102.02960: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882102.02967: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882102.02976: variable 'omit' from source: magic vars 7557 1726882102.03137: variable 'ansible_distribution_major_version' from source: facts 7557 1726882102.03144: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882102.03271: variable 'type' from source: play vars 7557 1726882102.03275: variable 'state' from source: include params 7557 1726882102.03277: variable 'interface' from source: play vars 7557 1726882102.03282: variable 'current_interfaces' from source: set_fact 7557 1726882102.03288: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): False 7557 1726882102.03291: when evaluation is False, skipping this task 7557 1726882102.03314: variable 'item' from source: unknown 7557 1726882102.03364: variable 'item' from source: unknown skipping: [managed_node3] => (item=ip link add veth0 type veth peer name peerveth0) => { "ansible_loop_var": "item", "changed": false, "false_condition": "type == 'veth' and state == 'present' and interface not in current_interfaces", "item": "ip link add veth0 type veth peer name peerveth0", "skip_reason": "Conditional result was False" } 7557 1726882102.03518: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882102.03521: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882102.03523: variable 'omit' from source: magic vars 7557 1726882102.03576: variable 'ansible_distribution_major_version' from source: facts 7557 1726882102.03579: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882102.03702: variable 'type' from source: play vars 7557 1726882102.03705: variable 'state' from source: include params 7557 1726882102.03709: variable 'interface' from source: play vars 7557 1726882102.03713: variable 'current_interfaces' from source: set_fact 7557 1726882102.03719: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): False 7557 1726882102.03722: when evaluation is False, skipping this task 7557 1726882102.03739: variable 'item' from source: unknown 7557 1726882102.03783: variable 'item' from source: unknown skipping: [managed_node3] => (item=ip link set peerveth0 up) => { "ansible_loop_var": "item", "changed": false, "false_condition": "type == 'veth' and state == 'present' and interface not in current_interfaces", "item": "ip link set peerveth0 up", "skip_reason": "Conditional result was False" } 7557 1726882102.03854: variable 'ansible_host' from source: host vars for 
'managed_node3' 7557 1726882102.03858: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882102.03860: variable 'omit' from source: magic vars 7557 1726882102.03956: variable 'ansible_distribution_major_version' from source: facts 7557 1726882102.03959: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882102.04074: variable 'type' from source: play vars 7557 1726882102.04077: variable 'state' from source: include params 7557 1726882102.04082: variable 'interface' from source: play vars 7557 1726882102.04085: variable 'current_interfaces' from source: set_fact 7557 1726882102.04095: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): False 7557 1726882102.04098: when evaluation is False, skipping this task 7557 1726882102.04114: variable 'item' from source: unknown 7557 1726882102.04154: variable 'item' from source: unknown skipping: [managed_node3] => (item=ip link set veth0 up) => { "ansible_loop_var": "item", "changed": false, "false_condition": "type == 'veth' and state == 'present' and interface not in current_interfaces", "item": "ip link set veth0 up", "skip_reason": "Conditional result was False" } 7557 1726882102.04229: dumping result to json 7557 1726882102.04232: done dumping result, returning 7557 1726882102.04234: done running TaskExecutor() for managed_node3/TASK: Create veth interface veth0 [12673a56-9f93-ed48-b3a5-000000000e03] 7557 1726882102.04236: sending task result for task 12673a56-9f93-ed48-b3a5-000000000e03 7557 1726882102.04267: done sending task result for task 12673a56-9f93-ed48-b3a5-000000000e03 7557 1726882102.04269: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false } MSG: All items skipped 7557 1726882102.04302: no more pending results, returning what we have 7557 1726882102.04306: results queue empty 7557 1726882102.04307: checking for any_errors_fatal 7557 1726882102.04312: done checking for any_errors_fatal 7557 1726882102.04312: checking for max_fail_percentage 7557 1726882102.04314: done checking for max_fail_percentage 7557 1726882102.04315: checking to see if all hosts have failed and the running result is not ok 7557 1726882102.04316: done checking to see if all hosts have failed 7557 1726882102.04316: getting the remaining hosts for this loop 7557 1726882102.04318: done getting the remaining hosts for this loop 7557 1726882102.04321: getting the next task for host managed_node3 7557 1726882102.04328: done getting next task for host managed_node3 7557 1726882102.04330: ^ task is: TASK: Set up veth as managed by NetworkManager 7557 1726882102.04332: ^ state is: HOST STATE: block=2, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7557 1726882102.04337: getting variables 7557 1726882102.04338: in VariableManager get_vars() 7557 1726882102.04383: Calling all_inventory to load vars for managed_node3 7557 1726882102.04386: Calling groups_inventory to load vars for managed_node3 7557 1726882102.04388: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882102.04401: Calling all_plugins_play to load vars for managed_node3 7557 1726882102.04404: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882102.04406: Calling groups_plugins_play to load vars for managed_node3 7557 1726882102.05267: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882102.06101: done with get_vars() 7557 1726882102.06115: done getting variables 7557 1726882102.06156: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set up veth as managed by NetworkManager] ******************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:35 Friday 20 September 2024 21:28:22 -0400 (0:00:00.045) 0:00:27.914 ****** 7557 1726882102.06179: entering _queue_task() for managed_node3/command 7557 1726882102.06383: worker is 1 (out of 1 available) 7557 1726882102.06399: exiting _queue_task() for managed_node3/command 7557 1726882102.06412: done queuing things up, now waiting for results queue to drain 7557 1726882102.06413: waiting for pending results... 7557 1726882102.06580: running TaskExecutor() for managed_node3/TASK: Set up veth as managed by NetworkManager 7557 1726882102.06649: in run() - task 12673a56-9f93-ed48-b3a5-000000000e04 7557 1726882102.06661: variable 'ansible_search_path' from source: unknown 7557 1726882102.06664: variable 'ansible_search_path' from source: unknown 7557 1726882102.06691: calling self._execute() 7557 1726882102.06770: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882102.06774: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882102.06782: variable 'omit' from source: magic vars 7557 1726882102.07041: variable 'ansible_distribution_major_version' from source: facts 7557 1726882102.07050: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882102.07154: variable 'type' from source: play vars 7557 1726882102.07158: variable 'state' from source: include params 7557 1726882102.07163: Evaluated conditional (type == 'veth' and state == 'present'): False 7557 1726882102.07166: when evaluation is False, skipping this task 7557 1726882102.07169: _execute() done 7557 1726882102.07171: dumping result to json 7557 1726882102.07173: done dumping result, returning 7557 1726882102.07185: done running TaskExecutor() for managed_node3/TASK: Set up veth as managed by NetworkManager [12673a56-9f93-ed48-b3a5-000000000e04] 7557 1726882102.07188: sending task result for task 12673a56-9f93-ed48-b3a5-000000000e04 7557 1726882102.07265: done sending task result for task 12673a56-9f93-ed48-b3a5-000000000e04 7557 1726882102.07267: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "type == 'veth' and state == 'present'", "skip_reason": "Conditional result was False" } 
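
Only this task's guard is visible, since the task is skipped; a sketch of its likely shape follows, where the nmcli command line is a hypothetical placeholder because the actual command never appears in this trace:

    # Guard copied from the skip record above; the command body is NOT
    # visible in this log and is a hypothetical placeholder.
    - name: Set up veth as managed by NetworkManager
      ansible.builtin.command: nmcli d set {{ interface }} managed true
      when: type == 'veth' and state == 'present'
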
7557 1726882102.07329: no more pending results, returning what we have 7557 1726882102.07332: results queue empty 7557 1726882102.07333: checking for any_errors_fatal 7557 1726882102.07341: done checking for any_errors_fatal 7557 1726882102.07342: checking for max_fail_percentage 7557 1726882102.07344: done checking for max_fail_percentage 7557 1726882102.07344: checking to see if all hosts have failed and the running result is not ok 7557 1726882102.07345: done checking to see if all hosts have failed 7557 1726882102.07346: getting the remaining hosts for this loop 7557 1726882102.07347: done getting the remaining hosts for this loop 7557 1726882102.07350: getting the next task for host managed_node3 7557 1726882102.07355: done getting next task for host managed_node3 7557 1726882102.07357: ^ task is: TASK: Delete veth interface {{ interface }} 7557 1726882102.07359: ^ state is: HOST STATE: block=2, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7557 1726882102.07362: getting variables 7557 1726882102.07363: in VariableManager get_vars() 7557 1726882102.07404: Calling all_inventory to load vars for managed_node3 7557 1726882102.07406: Calling groups_inventory to load vars for managed_node3 7557 1726882102.07408: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882102.07418: Calling all_plugins_play to load vars for managed_node3 7557 1726882102.07420: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882102.07423: Calling groups_plugins_play to load vars for managed_node3 7557 1726882102.08126: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882102.09043: done with get_vars() 7557 1726882102.09057: done getting variables 7557 1726882102.09100: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 7557 1726882102.09172: variable 'interface' from source: play vars TASK [Delete veth interface veth0] ********************************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:43 Friday 20 September 2024 21:28:22 -0400 (0:00:00.030) 0:00:27.944 ****** 7557 1726882102.09198: entering _queue_task() for managed_node3/command 7557 1726882102.09384: worker is 1 (out of 1 available) 7557 1726882102.09400: exiting _queue_task() for managed_node3/command 7557 1726882102.09412: done queuing things up, now waiting for results queue to drain 7557 1726882102.09413: waiting for pending results... 
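
The trace below evaluates this task's guard to True and proceeds to execute it over SSH; a sketch of the task's likely shape, where the deletion command is inferred by symmetry with the creation task above rather than read from this excerpt:

    # Condition copied from the evaluation in the trace below; the deletion
    # command is an inference from the matching "ip link add", not verbatim.
    - name: Delete veth interface {{ interface }}
      ansible.builtin.command: ip link del {{ interface }}
      when: type == 'veth' and state == 'absent' and interface in current_interfaces
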
7557 1726882102.09576: running TaskExecutor() for managed_node3/TASK: Delete veth interface veth0 7557 1726882102.09651: in run() - task 12673a56-9f93-ed48-b3a5-000000000e05 7557 1726882102.09663: variable 'ansible_search_path' from source: unknown 7557 1726882102.09666: variable 'ansible_search_path' from source: unknown 7557 1726882102.09692: calling self._execute() 7557 1726882102.09766: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882102.09771: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882102.09779: variable 'omit' from source: magic vars 7557 1726882102.10030: variable 'ansible_distribution_major_version' from source: facts 7557 1726882102.10040: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882102.10167: variable 'type' from source: play vars 7557 1726882102.10174: variable 'state' from source: include params 7557 1726882102.10177: variable 'interface' from source: play vars 7557 1726882102.10180: variable 'current_interfaces' from source: set_fact 7557 1726882102.10190: Evaluated conditional (type == 'veth' and state == 'absent' and interface in current_interfaces): True 7557 1726882102.10198: variable 'omit' from source: magic vars 7557 1726882102.10223: variable 'omit' from source: magic vars 7557 1726882102.10289: variable 'interface' from source: play vars 7557 1726882102.10306: variable 'omit' from source: magic vars 7557 1726882102.10337: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7557 1726882102.10362: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7557 1726882102.10378: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7557 1726882102.10392: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7557 1726882102.10407: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7557 1726882102.10432: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7557 1726882102.10435: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882102.10437: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882102.10511: Set connection var ansible_module_compression to ZIP_DEFLATED 7557 1726882102.10515: Set connection var ansible_shell_executable to /bin/sh 7557 1726882102.10518: Set connection var ansible_shell_type to sh 7557 1726882102.10523: Set connection var ansible_pipelining to False 7557 1726882102.10525: Set connection var ansible_connection to ssh 7557 1726882102.10531: Set connection var ansible_timeout to 10 7557 1726882102.10546: variable 'ansible_shell_executable' from source: unknown 7557 1726882102.10548: variable 'ansible_connection' from source: unknown 7557 1726882102.10551: variable 'ansible_module_compression' from source: unknown 7557 1726882102.10553: variable 'ansible_shell_type' from source: unknown 7557 1726882102.10556: variable 'ansible_shell_executable' from source: unknown 7557 1726882102.10558: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882102.10561: variable 'ansible_pipelining' from source: unknown 7557 1726882102.10564: variable 'ansible_timeout' from source: unknown 7557 1726882102.10568: variable 'ansible_ssh_extra_args' from source: 
host vars for 'managed_node3' 7557 1726882102.10670: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 7557 1726882102.10679: variable 'omit' from source: magic vars 7557 1726882102.10683: starting attempt loop 7557 1726882102.10686: running the handler 7557 1726882102.10706: _low_level_execute_command(): starting 7557 1726882102.10716: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7557 1726882102.11234: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7557 1726882102.11239: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration <<< 7557 1726882102.11242: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found <<< 7557 1726882102.11244: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882102.11299: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882102.11302: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882102.11304: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882102.11359: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882102.13001: stdout chunk (state=3): >>>/root <<< 7557 1726882102.13105: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882102.13131: stderr chunk (state=3): >>><<< 7557 1726882102.13134: stdout chunk (state=3): >>><<< 7557 1726882102.13152: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7557 1726882102.13164: _low_level_execute_command(): starting 7557 1726882102.13169: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882102.1315181-8666-149908944131129 `" && echo ansible-tmp-1726882102.1315181-8666-149908944131129="` echo /root/.ansible/tmp/ansible-tmp-1726882102.1315181-8666-149908944131129 `" ) && sleep 0' 7557 1726882102.13570: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882102.13580: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7557 1726882102.13612: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882102.13615: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 7557 1726882102.13618: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7557 1726882102.13621: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882102.13669: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882102.13672: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882102.13724: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882102.15579: stdout chunk (state=3): >>>ansible-tmp-1726882102.1315181-8666-149908944131129=/root/.ansible/tmp/ansible-tmp-1726882102.1315181-8666-149908944131129 <<< 7557 1726882102.15681: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882102.15710: stderr chunk (state=3): >>><<< 7557 1726882102.15713: stdout chunk (state=3): >>><<< 7557 1726882102.15726: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882102.1315181-8666-149908944131129=/root/.ansible/tmp/ansible-tmp-1726882102.1315181-8666-149908944131129 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7557 1726882102.15750: variable 'ansible_module_compression' from source: unknown 7557 1726882102.15792: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-7557ap94rh2e/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 7557 1726882102.15822: variable 'ansible_facts' from source: unknown 7557 1726882102.15880: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882102.1315181-8666-149908944131129/AnsiballZ_command.py 7557 1726882102.15971: Sending initial data 7557 1726882102.15974: Sent initial data (154 bytes) 7557 1726882102.16409: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7557 1726882102.16414: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 <<< 7557 1726882102.16416: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882102.16418: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882102.16423: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882102.16466: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882102.16470: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882102.16521: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882102.18051: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 <<< 7557 1726882102.18054: stderr chunk (state=3): >>>debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 7557 1726882102.18091: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 7557 1726882102.18135: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-7557ap94rh2e/tmpyagus414 /root/.ansible/tmp/ansible-tmp-1726882102.1315181-8666-149908944131129/AnsiballZ_command.py <<< 7557 1726882102.18141: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882102.1315181-8666-149908944131129/AnsiballZ_command.py" <<< 7557 1726882102.18184: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-7557ap94rh2e/tmpyagus414" to remote "/root/.ansible/tmp/ansible-tmp-1726882102.1315181-8666-149908944131129/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882102.1315181-8666-149908944131129/AnsiballZ_command.py" <<< 7557 1726882102.18744: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882102.18783: stderr chunk (state=3): >>><<< 7557 1726882102.18786: stdout chunk (state=3): >>><<< 7557 1726882102.18830: done transferring module to remote 7557 1726882102.18841: _low_level_execute_command(): starting 7557 1726882102.18844: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882102.1315181-8666-149908944131129/ /root/.ansible/tmp/ansible-tmp-1726882102.1315181-8666-149908944131129/AnsiballZ_command.py && sleep 0' 7557 1726882102.19292: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882102.19302: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found <<< 7557 1726882102.19308: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration <<< 7557 1726882102.19311: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found <<< 7557 1726882102.19313: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882102.19357: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882102.19362: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882102.19365: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882102.19412: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882102.21117: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882102.21143: stderr chunk (state=3): >>><<< 7557 1726882102.21147: stdout chunk (state=3): >>><<< 7557 1726882102.21161: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7557 1726882102.21163: _low_level_execute_command(): starting 7557 1726882102.21169: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882102.1315181-8666-149908944131129/AnsiballZ_command.py && sleep 0' 7557 1726882102.21624: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7557 1726882102.21627: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882102.21630: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 7557 1726882102.21632: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882102.21685: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882102.21691: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882102.21739: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882102.37803: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "del", "veth0", "type", "veth"], "start": "2024-09-20 21:28:22.365586", "end": "2024-09-20 21:28:22.375158", "delta": "0:00:00.009572", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link del veth0 type veth", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 7557 1726882102.40030: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. 
<<< 7557 1726882102.40034: stdout chunk (state=3): >>><<< 7557 1726882102.40037: stderr chunk (state=3): >>><<< 7557 1726882102.40178: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "del", "veth0", "type", "veth"], "start": "2024-09-20 21:28:22.365586", "end": "2024-09-20 21:28:22.375158", "delta": "0:00:00.009572", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link del veth0 type veth", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. 
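The round trips above follow Ansible's non-pipelined execution path (this trace set ansible_pipelining to False when the connection vars were established): probe the remote home directory, create a per-task temp directory, sftp AnsiballZ_command.py across, chmod it, run it with the remote python, and, just below, remove the temp directory again. As a hedged aside, turning pipelining on in inventory or group_vars would collapse most of these steps into a single SSH exchange; ansible_pipelining is the standard variable for this, though where to set it depends on the setup:

# group_vars/all.yml (illustrative only; not part of this run, which has pipelining off)
ansible_pipelining: true   # stream the module over the open SSH channel instead of sftp + temp files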
7557 1726882102.40183: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link del veth0 type veth', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882102.1315181-8666-149908944131129/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7557 1726882102.40186: _low_level_execute_command(): starting 7557 1726882102.40189: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882102.1315181-8666-149908944131129/ > /dev/null 2>&1 && sleep 0' 7557 1726882102.40767: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7557 1726882102.40784: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7557 1726882102.40869: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882102.40914: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882102.40939: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882102.40952: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882102.41031: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882102.42880: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882102.42888: stdout chunk (state=3): >>><<< 7557 1726882102.42902: stderr chunk (state=3): >>><<< 7557 1726882102.42919: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7557 1726882102.42929: handler run complete 7557 1726882102.43102: Evaluated conditional (False): False 7557 1726882102.43105: attempt loop complete, returning result 7557 1726882102.43107: _execute() done 7557 1726882102.43109: dumping result to json 7557 1726882102.43110: done dumping result, returning 7557 1726882102.43112: done running TaskExecutor() for managed_node3/TASK: Delete veth interface veth0 [12673a56-9f93-ed48-b3a5-000000000e05] 7557 1726882102.43114: sending task result for task 12673a56-9f93-ed48-b3a5-000000000e05 7557 1726882102.43180: done sending task result for task 12673a56-9f93-ed48-b3a5-000000000e05 7557 1726882102.43183: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "cmd": [ "ip", "link", "del", "veth0", "type", "veth" ], "delta": "0:00:00.009572", "end": "2024-09-20 21:28:22.375158", "rc": 0, "start": "2024-09-20 21:28:22.365586" } 7557 1726882102.43250: no more pending results, returning what we have 7557 1726882102.43254: results queue empty 7557 1726882102.43255: checking for any_errors_fatal 7557 1726882102.43263: done checking for any_errors_fatal 7557 1726882102.43264: checking for max_fail_percentage 7557 1726882102.43266: done checking for max_fail_percentage 7557 1726882102.43266: checking to see if all hosts have failed and the running result is not ok 7557 1726882102.43267: done checking to see if all hosts have failed 7557 1726882102.43268: getting the remaining hosts for this loop 7557 1726882102.43269: done getting the remaining hosts for this loop 7557 1726882102.43272: getting the next task for host managed_node3 7557 1726882102.43279: done getting next task for host managed_node3 7557 1726882102.43281: ^ task is: TASK: Create dummy interface {{ interface }} 7557 1726882102.43285: ^ state is: HOST STATE: block=2, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7557 1726882102.43289: getting variables 7557 1726882102.43290: in VariableManager get_vars() 7557 1726882102.43344: Calling all_inventory to load vars for managed_node3 7557 1726882102.43347: Calling groups_inventory to load vars for managed_node3 7557 1726882102.43350: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882102.43360: Calling all_plugins_play to load vars for managed_node3 7557 1726882102.43362: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882102.43365: Calling groups_plugins_play to load vars for managed_node3 7557 1726882102.45043: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882102.46686: done with get_vars() 7557 1726882102.46714: done getting variables 7557 1726882102.46780: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 7557 1726882102.46901: variable 'interface' from source: play vars TASK [Create dummy interface veth0] ******************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:49 Friday 20 September 2024 21:28:22 -0400 (0:00:00.377) 0:00:28.322 ****** 7557 1726882102.46937: entering _queue_task() for managed_node3/command 7557 1726882102.47605: worker is 1 (out of 1 available) 7557 1726882102.47615: exiting _queue_task() for managed_node3/command 7557 1726882102.47628: done queuing things up, now waiting for results queue to drain 7557 1726882102.47634: waiting for pending results... 
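One detail worth flagging in the result just printed: the module payload earlier in the trace reported "changed": true for the ip link del, yet the task is displayed as ok with "changed": false. The single "Evaluated conditional (False): False" logged right after "handler run complete" is most plausibly a changed_when: false on the task, which overrides the module's changed flag for reporting. A hedged sketch of that attribute (the attribute itself is an assumption inferred from the trace):

- name: Delete veth interface {{ interface }}
  command: ip link del {{ interface }} type veth
  changed_when: false   # assumption: would produce exactly the ok / "changed": false report seen above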
7557 1726882102.47809: running TaskExecutor() for managed_node3/TASK: Create dummy interface veth0 7557 1726882102.47813: in run() - task 12673a56-9f93-ed48-b3a5-000000000e06 7557 1726882102.47816: variable 'ansible_search_path' from source: unknown 7557 1726882102.47819: variable 'ansible_search_path' from source: unknown 7557 1726882102.47835: calling self._execute() 7557 1726882102.47938: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882102.47950: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882102.47963: variable 'omit' from source: magic vars 7557 1726882102.48306: variable 'ansible_distribution_major_version' from source: facts 7557 1726882102.48322: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882102.48517: variable 'type' from source: play vars 7557 1726882102.48527: variable 'state' from source: include params 7557 1726882102.48535: variable 'interface' from source: play vars 7557 1726882102.48542: variable 'current_interfaces' from source: set_fact 7557 1726882102.48554: Evaluated conditional (type == 'dummy' and state == 'present' and interface not in current_interfaces): False 7557 1726882102.48561: when evaluation is False, skipping this task 7557 1726882102.48567: _execute() done 7557 1726882102.48573: dumping result to json 7557 1726882102.48579: done dumping result, returning 7557 1726882102.48590: done running TaskExecutor() for managed_node3/TASK: Create dummy interface veth0 [12673a56-9f93-ed48-b3a5-000000000e06] 7557 1726882102.48801: sending task result for task 12673a56-9f93-ed48-b3a5-000000000e06 7557 1726882102.48868: done sending task result for task 12673a56-9f93-ed48-b3a5-000000000e06 7557 1726882102.48872: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "type == 'dummy' and state == 'present' and interface not in current_interfaces", "skip_reason": "Conditional result was False" } 7557 1726882102.48914: no more pending results, returning what we have 7557 1726882102.48917: results queue empty 7557 1726882102.48918: checking for any_errors_fatal 7557 1726882102.48926: done checking for any_errors_fatal 7557 1726882102.48926: checking for max_fail_percentage 7557 1726882102.48928: done checking for max_fail_percentage 7557 1726882102.48929: checking to see if all hosts have failed and the running result is not ok 7557 1726882102.48929: done checking to see if all hosts have failed 7557 1726882102.48930: getting the remaining hosts for this loop 7557 1726882102.48931: done getting the remaining hosts for this loop 7557 1726882102.48934: getting the next task for host managed_node3 7557 1726882102.48941: done getting next task for host managed_node3 7557 1726882102.48943: ^ task is: TASK: Delete dummy interface {{ interface }} 7557 1726882102.48946: ^ state is: HOST STATE: block=2, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7557 1726882102.48950: getting variables 7557 1726882102.48951: in VariableManager get_vars() 7557 1726882102.48999: Calling all_inventory to load vars for managed_node3 7557 1726882102.49002: Calling groups_inventory to load vars for managed_node3 7557 1726882102.49005: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882102.49014: Calling all_plugins_play to load vars for managed_node3 7557 1726882102.49016: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882102.49019: Calling groups_plugins_play to load vars for managed_node3 7557 1726882102.50477: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882102.52046: done with get_vars() 7557 1726882102.52067: done getting variables 7557 1726882102.52129: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 7557 1726882102.52242: variable 'interface' from source: play vars TASK [Delete dummy interface veth0] ******************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:54 Friday 20 September 2024 21:28:22 -0400 (0:00:00.053) 0:00:28.375 ****** 7557 1726882102.52279: entering _queue_task() for managed_node3/command 7557 1726882102.52709: worker is 1 (out of 1 available) 7557 1726882102.52720: exiting _queue_task() for managed_node3/command 7557 1726882102.52731: done queuing things up, now waiting for results queue to drain 7557 1726882102.52733: waiting for pending results... 
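The dummy and tap tasks queued from here on are all skipped, and the false_condition fields in this trace show why: manage_test_interface.yml apparently carries one guarded create/delete pair per interface type, and with type == 'veth' and state == 'absent' only the veth-delete branch can fire. A sketch of the dummy pair, with the when clauses copied from the trace and the commands assumed (only the conditionals are visible here); the tap pair below follows the same shape:

- name: Create dummy interface {{ interface }}
  command: ip link add {{ interface }} type dummy   # assumed command
  when:
    - type == 'dummy'
    - state == 'present'
    - interface not in current_interfaces

- name: Delete dummy interface {{ interface }}
  command: ip link del {{ interface }} type dummy   # assumed command
  when:
    - type == 'dummy'
    - state == 'absent'
    - interface in current_interfaces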
7557 1726882102.52918: running TaskExecutor() for managed_node3/TASK: Delete dummy interface veth0 7557 1726882102.53027: in run() - task 12673a56-9f93-ed48-b3a5-000000000e07 7557 1726882102.53040: variable 'ansible_search_path' from source: unknown 7557 1726882102.53044: variable 'ansible_search_path' from source: unknown 7557 1726882102.53085: calling self._execute() 7557 1726882102.53202: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882102.53206: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882102.53215: variable 'omit' from source: magic vars 7557 1726882102.53590: variable 'ansible_distribution_major_version' from source: facts 7557 1726882102.53605: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882102.53807: variable 'type' from source: play vars 7557 1726882102.53813: variable 'state' from source: include params 7557 1726882102.53816: variable 'interface' from source: play vars 7557 1726882102.53821: variable 'current_interfaces' from source: set_fact 7557 1726882102.53834: Evaluated conditional (type == 'dummy' and state == 'absent' and interface in current_interfaces): False 7557 1726882102.53838: when evaluation is False, skipping this task 7557 1726882102.53841: _execute() done 7557 1726882102.53843: dumping result to json 7557 1726882102.53845: done dumping result, returning 7557 1726882102.53907: done running TaskExecutor() for managed_node3/TASK: Delete dummy interface veth0 [12673a56-9f93-ed48-b3a5-000000000e07] 7557 1726882102.53910: sending task result for task 12673a56-9f93-ed48-b3a5-000000000e07 7557 1726882102.53962: done sending task result for task 12673a56-9f93-ed48-b3a5-000000000e07 7557 1726882102.53965: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "type == 'dummy' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was False" } 7557 1726882102.54029: no more pending results, returning what we have 7557 1726882102.54034: results queue empty 7557 1726882102.54035: checking for any_errors_fatal 7557 1726882102.54043: done checking for any_errors_fatal 7557 1726882102.54043: checking for max_fail_percentage 7557 1726882102.54045: done checking for max_fail_percentage 7557 1726882102.54046: checking to see if all hosts have failed and the running result is not ok 7557 1726882102.54047: done checking to see if all hosts have failed 7557 1726882102.54047: getting the remaining hosts for this loop 7557 1726882102.54049: done getting the remaining hosts for this loop 7557 1726882102.54052: getting the next task for host managed_node3 7557 1726882102.54058: done getting next task for host managed_node3 7557 1726882102.54060: ^ task is: TASK: Create tap interface {{ interface }} 7557 1726882102.54064: ^ state is: HOST STATE: block=2, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7557 1726882102.54069: getting variables 7557 1726882102.54070: in VariableManager get_vars() 7557 1726882102.54123: Calling all_inventory to load vars for managed_node3 7557 1726882102.54126: Calling groups_inventory to load vars for managed_node3 7557 1726882102.54128: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882102.54140: Calling all_plugins_play to load vars for managed_node3 7557 1726882102.54143: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882102.54146: Calling groups_plugins_play to load vars for managed_node3 7557 1726882102.55588: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882102.57153: done with get_vars() 7557 1726882102.57185: done getting variables 7557 1726882102.57251: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 7557 1726882102.57365: variable 'interface' from source: play vars TASK [Create tap interface veth0] ********************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:60 Friday 20 September 2024 21:28:22 -0400 (0:00:00.051) 0:00:28.427 ****** 7557 1726882102.57406: entering _queue_task() for managed_node3/command 7557 1726882102.57911: worker is 1 (out of 1 available) 7557 1726882102.57922: exiting _queue_task() for managed_node3/command 7557 1726882102.57933: done queuing things up, now waiting for results queue to drain 7557 1726882102.57934: waiting for pending results... 
7557 1726882102.58113: running TaskExecutor() for managed_node3/TASK: Create tap interface veth0 7557 1726882102.58147: in run() - task 12673a56-9f93-ed48-b3a5-000000000e08 7557 1726882102.58168: variable 'ansible_search_path' from source: unknown 7557 1726882102.58172: variable 'ansible_search_path' from source: unknown 7557 1726882102.58209: calling self._execute() 7557 1726882102.58399: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882102.58403: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882102.58406: variable 'omit' from source: magic vars 7557 1726882102.58687: variable 'ansible_distribution_major_version' from source: facts 7557 1726882102.58706: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882102.58921: variable 'type' from source: play vars 7557 1726882102.58931: variable 'state' from source: include params 7557 1726882102.58934: variable 'interface' from source: play vars 7557 1726882102.58937: variable 'current_interfaces' from source: set_fact 7557 1726882102.58948: Evaluated conditional (type == 'tap' and state == 'present' and interface not in current_interfaces): False 7557 1726882102.58951: when evaluation is False, skipping this task 7557 1726882102.58954: _execute() done 7557 1726882102.58956: dumping result to json 7557 1726882102.58959: done dumping result, returning 7557 1726882102.58965: done running TaskExecutor() for managed_node3/TASK: Create tap interface veth0 [12673a56-9f93-ed48-b3a5-000000000e08] 7557 1726882102.58971: sending task result for task 12673a56-9f93-ed48-b3a5-000000000e08 7557 1726882102.59056: done sending task result for task 12673a56-9f93-ed48-b3a5-000000000e08 7557 1726882102.59060: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "type == 'tap' and state == 'present' and interface not in current_interfaces", "skip_reason": "Conditional result was False" } 7557 1726882102.59113: no more pending results, returning what we have 7557 1726882102.59118: results queue empty 7557 1726882102.59119: checking for any_errors_fatal 7557 1726882102.59126: done checking for any_errors_fatal 7557 1726882102.59126: checking for max_fail_percentage 7557 1726882102.59128: done checking for max_fail_percentage 7557 1726882102.59129: checking to see if all hosts have failed and the running result is not ok 7557 1726882102.59130: done checking to see if all hosts have failed 7557 1726882102.59131: getting the remaining hosts for this loop 7557 1726882102.59132: done getting the remaining hosts for this loop 7557 1726882102.59137: getting the next task for host managed_node3 7557 1726882102.59144: done getting next task for host managed_node3 7557 1726882102.59147: ^ task is: TASK: Delete tap interface {{ interface }} 7557 1726882102.59151: ^ state is: HOST STATE: block=2, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7557 1726882102.59155: getting variables 7557 1726882102.59156: in VariableManager get_vars() 7557 1726882102.59213: Calling all_inventory to load vars for managed_node3 7557 1726882102.59216: Calling groups_inventory to load vars for managed_node3 7557 1726882102.59219: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882102.59232: Calling all_plugins_play to load vars for managed_node3 7557 1726882102.59236: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882102.59239: Calling groups_plugins_play to load vars for managed_node3 7557 1726882102.65662: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882102.67262: done with get_vars() 7557 1726882102.67285: done getting variables 7557 1726882102.67336: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 7557 1726882102.67433: variable 'interface' from source: play vars TASK [Delete tap interface veth0] ********************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:65 Friday 20 September 2024 21:28:22 -0400 (0:00:00.100) 0:00:28.527 ****** 7557 1726882102.67458: entering _queue_task() for managed_node3/command 7557 1726882102.67802: worker is 1 (out of 1 available) 7557 1726882102.67816: exiting _queue_task() for managed_node3/command 7557 1726882102.67830: done queuing things up, now waiting for results queue to drain 7557 1726882102.67831: waiting for pending results... 
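After the remaining tap task is skipped below, tests_auto_gateway.yml:83 prints its banner and tests_auto_gateway.yml:87 re-includes manage_test_interface.yml, whose first task ("Ensure state in [\"present\", \"absent\"]", line 3) validates the state parameter before anything else runs. A hedged sketch of that call and guard; the vars actually passed to the include and the fail message are not visible in this section:

# tests_auto_gateway.yml (sketch)
- name: Include the task 'manage_test_interface.yml'
  include_tasks: tasks/manage_test_interface.yml
  # state/type/interface come from play vars and include params; values not shown here

# manage_test_interface.yml, first task (sketch)
- name: Ensure state in ["present", "absent"]
  fail:
    msg: state must be one of present or absent   # hypothetical wording
  when: 'state not in ["present", "absent"]'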
7557 1726882102.68212: running TaskExecutor() for managed_node3/TASK: Delete tap interface veth0 7557 1726882102.68222: in run() - task 12673a56-9f93-ed48-b3a5-000000000e09 7557 1726882102.68227: variable 'ansible_search_path' from source: unknown 7557 1726882102.68230: variable 'ansible_search_path' from source: unknown 7557 1726882102.68249: calling self._execute() 7557 1726882102.68358: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882102.68370: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882102.68382: variable 'omit' from source: magic vars 7557 1726882102.68734: variable 'ansible_distribution_major_version' from source: facts 7557 1726882102.68752: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882102.68938: variable 'type' from source: play vars 7557 1726882102.68948: variable 'state' from source: include params 7557 1726882102.68955: variable 'interface' from source: play vars 7557 1726882102.68961: variable 'current_interfaces' from source: set_fact 7557 1726882102.68974: Evaluated conditional (type == 'tap' and state == 'absent' and interface in current_interfaces): False 7557 1726882102.68980: when evaluation is False, skipping this task 7557 1726882102.68987: _execute() done 7557 1726882102.68994: dumping result to json 7557 1726882102.69003: done dumping result, returning 7557 1726882102.69011: done running TaskExecutor() for managed_node3/TASK: Delete tap interface veth0 [12673a56-9f93-ed48-b3a5-000000000e09] 7557 1726882102.69019: sending task result for task 12673a56-9f93-ed48-b3a5-000000000e09 7557 1726882102.69355: done sending task result for task 12673a56-9f93-ed48-b3a5-000000000e09 7557 1726882102.69359: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "type == 'tap' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was False" } 7557 1726882102.69403: no more pending results, returning what we have 7557 1726882102.69446: results queue empty 7557 1726882102.69447: checking for any_errors_fatal 7557 1726882102.69452: done checking for any_errors_fatal 7557 1726882102.69453: checking for max_fail_percentage 7557 1726882102.69455: done checking for max_fail_percentage 7557 1726882102.69456: checking to see if all hosts have failed and the running result is not ok 7557 1726882102.69457: done checking to see if all hosts have failed 7557 1726882102.69457: getting the remaining hosts for this loop 7557 1726882102.69459: done getting the remaining hosts for this loop 7557 1726882102.69462: getting the next task for host managed_node3 7557 1726882102.69469: done getting next task for host managed_node3 7557 1726882102.69472: ^ task is: TASK: TEST: I can configure an interface with auto_gateway disabled 7557 1726882102.69474: ^ state is: HOST STATE: block=2, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7557 1726882102.69478: getting variables 7557 1726882102.69479: in VariableManager get_vars() 7557 1726882102.69536: Calling all_inventory to load vars for managed_node3 7557 1726882102.69539: Calling groups_inventory to load vars for managed_node3 7557 1726882102.69541: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882102.69551: Calling all_plugins_play to load vars for managed_node3 7557 1726882102.69553: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882102.69556: Calling groups_plugins_play to load vars for managed_node3 7557 1726882102.70943: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882102.72600: done with get_vars() 7557 1726882102.72627: done getting variables 7557 1726882102.72691: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [TEST: I can configure an interface with auto_gateway disabled] *********** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_auto_gateway.yml:83 Friday 20 September 2024 21:28:22 -0400 (0:00:00.052) 0:00:28.580 ****** 7557 1726882102.72725: entering _queue_task() for managed_node3/debug 7557 1726882102.73066: worker is 1 (out of 1 available) 7557 1726882102.73081: exiting _queue_task() for managed_node3/debug 7557 1726882102.73101: done queuing things up, now waiting for results queue to drain 7557 1726882102.73102: waiting for pending results... 7557 1726882102.73366: running TaskExecutor() for managed_node3/TASK: TEST: I can configure an interface with auto_gateway disabled 7557 1726882102.73480: in run() - task 12673a56-9f93-ed48-b3a5-0000000000af 7557 1726882102.73505: variable 'ansible_search_path' from source: unknown 7557 1726882102.73545: calling self._execute() 7557 1726882102.73676: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882102.73689: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882102.73705: variable 'omit' from source: magic vars 7557 1726882102.74092: variable 'ansible_distribution_major_version' from source: facts 7557 1726882102.74123: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882102.74134: variable 'omit' from source: magic vars 7557 1726882102.74166: variable 'omit' from source: magic vars 7557 1726882102.74209: variable 'omit' from source: magic vars 7557 1726882102.74253: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7557 1726882102.74302: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7557 1726882102.74327: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7557 1726882102.74350: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7557 1726882102.74376: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7557 1726882102.74421: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7557 1726882102.74430: variable 'ansible_host' from source: host vars for 
'managed_node3' 7557 1726882102.74453: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882102.74798: Set connection var ansible_module_compression to ZIP_DEFLATED 7557 1726882102.74801: Set connection var ansible_shell_executable to /bin/sh 7557 1726882102.74804: Set connection var ansible_shell_type to sh 7557 1726882102.74806: Set connection var ansible_pipelining to False 7557 1726882102.74820: Set connection var ansible_connection to ssh 7557 1726882102.74823: Set connection var ansible_timeout to 10 7557 1726882102.74825: variable 'ansible_shell_executable' from source: unknown 7557 1726882102.74827: variable 'ansible_connection' from source: unknown 7557 1726882102.74829: variable 'ansible_module_compression' from source: unknown 7557 1726882102.74832: variable 'ansible_shell_type' from source: unknown 7557 1726882102.74834: variable 'ansible_shell_executable' from source: unknown 7557 1726882102.74836: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882102.74838: variable 'ansible_pipelining' from source: unknown 7557 1726882102.74840: variable 'ansible_timeout' from source: unknown 7557 1726882102.74842: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882102.74845: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 7557 1726882102.74847: variable 'omit' from source: magic vars 7557 1726882102.74850: starting attempt loop 7557 1726882102.74852: running the handler 7557 1726882102.74877: handler run complete 7557 1726882102.74903: attempt loop complete, returning result 7557 1726882102.74911: _execute() done 7557 1726882102.74918: dumping result to json 7557 1726882102.74925: done dumping result, returning 7557 1726882102.74936: done running TaskExecutor() for managed_node3/TASK: TEST: I can configure an interface with auto_gateway disabled [12673a56-9f93-ed48-b3a5-0000000000af] 7557 1726882102.74946: sending task result for task 12673a56-9f93-ed48-b3a5-0000000000af ok: [managed_node3] => {} MSG: ################################################## 7557 1726882102.75084: no more pending results, returning what we have 7557 1726882102.75088: results queue empty 7557 1726882102.75089: checking for any_errors_fatal 7557 1726882102.75097: done checking for any_errors_fatal 7557 1726882102.75097: checking for max_fail_percentage 7557 1726882102.75099: done checking for max_fail_percentage 7557 1726882102.75100: checking to see if all hosts have failed and the running result is not ok 7557 1726882102.75101: done checking to see if all hosts have failed 7557 1726882102.75101: getting the remaining hosts for this loop 7557 1726882102.75103: done getting the remaining hosts for this loop 7557 1726882102.75106: getting the next task for host managed_node3 7557 1726882102.75111: done getting next task for host managed_node3 7557 1726882102.75114: ^ task is: TASK: Include the task 'manage_test_interface.yml' 7557 1726882102.75115: ^ state is: HOST STATE: block=2, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7557 1726882102.75118: getting variables 7557 1726882102.75120: in VariableManager get_vars() 7557 1726882102.75368: Calling all_inventory to load vars for managed_node3 7557 1726882102.75372: Calling groups_inventory to load vars for managed_node3 7557 1726882102.75375: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882102.75382: done sending task result for task 12673a56-9f93-ed48-b3a5-0000000000af 7557 1726882102.75385: WORKER PROCESS EXITING 7557 1726882102.75396: Calling all_plugins_play to load vars for managed_node3 7557 1726882102.75399: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882102.75403: Calling groups_plugins_play to load vars for managed_node3 7557 1726882102.76767: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882102.78311: done with get_vars() 7557 1726882102.78332: done getting variables TASK [Include the task 'manage_test_interface.yml'] **************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_auto_gateway.yml:87 Friday 20 September 2024 21:28:22 -0400 (0:00:00.056) 0:00:28.637 ****** 7557 1726882102.78422: entering _queue_task() for managed_node3/include_tasks 7557 1726882102.78821: worker is 1 (out of 1 available) 7557 1726882102.78833: exiting _queue_task() for managed_node3/include_tasks 7557 1726882102.78844: done queuing things up, now waiting for results queue to drain 7557 1726882102.78845: waiting for pending results... 7557 1726882102.79251: running TaskExecutor() for managed_node3/TASK: Include the task 'manage_test_interface.yml' 7557 1726882102.79320: in run() - task 12673a56-9f93-ed48-b3a5-0000000000b0 7557 1726882102.79343: variable 'ansible_search_path' from source: unknown 7557 1726882102.79388: calling self._execute() 7557 1726882102.79511: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882102.79570: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882102.79573: variable 'omit' from source: magic vars 7557 1726882102.79942: variable 'ansible_distribution_major_version' from source: facts 7557 1726882102.79960: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882102.79973: _execute() done 7557 1726882102.79983: dumping result to json 7557 1726882102.79992: done dumping result, returning 7557 1726882102.80012: done running TaskExecutor() for managed_node3/TASK: Include the task 'manage_test_interface.yml' [12673a56-9f93-ed48-b3a5-0000000000b0] 7557 1726882102.80099: sending task result for task 12673a56-9f93-ed48-b3a5-0000000000b0 7557 1726882102.80178: done sending task result for task 12673a56-9f93-ed48-b3a5-0000000000b0 7557 1726882102.80182: WORKER PROCESS EXITING 7557 1726882102.80214: no more pending results, returning what we have 7557 1726882102.80219: in VariableManager get_vars() 7557 1726882102.80280: Calling all_inventory to load vars for managed_node3 7557 1726882102.80283: Calling groups_inventory to load vars for managed_node3 7557 1726882102.80288: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882102.80305: Calling all_plugins_play to load vars for managed_node3 7557 1726882102.80309: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882102.80313: Calling groups_plugins_play to load vars for managed_node3 7557 1726882102.81824: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882102.85190: done with get_vars() 7557 1726882102.85220: variable 'ansible_search_path' from source: unknown 7557 1726882102.85237: we have included files to process 7557 1726882102.85238: generating all_blocks data 7557 1726882102.85240: done generating all_blocks data 7557 1726882102.85245: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 7557 1726882102.85246: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 7557 1726882102.85249: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 7557 1726882102.85668: in VariableManager get_vars() 7557 1726882102.85699: done with get_vars() 7557 1726882102.86368: done processing included file 7557 1726882102.86371: iterating over new_blocks loaded from include file 7557 1726882102.86372: in VariableManager get_vars() 7557 1726882102.86422: done with get_vars() 7557 1726882102.86425: filtering new block on tags 7557 1726882102.86463: done filtering new block on tags 7557 1726882102.86466: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml for managed_node3 7557 1726882102.86472: extending task lists for all hosts with included blocks 7557 1726882102.90005: done extending task lists 7557 1726882102.90007: done processing included files 7557 1726882102.90008: results queue empty 7557 1726882102.90008: checking for any_errors_fatal 7557 1726882102.90013: done checking for any_errors_fatal 7557 1726882102.90014: checking for max_fail_percentage 7557 1726882102.90015: done checking for max_fail_percentage 7557 1726882102.90015: checking to see if all hosts have failed and the running result is not ok 7557 1726882102.90016: done checking to see if all hosts have failed 7557 1726882102.90017: getting the remaining hosts for this loop 7557 1726882102.90018: done getting the remaining hosts for this loop 7557 1726882102.90021: getting the next task for host managed_node3 7557 1726882102.90026: done getting next task for host managed_node3 7557 1726882102.90028: ^ task is: TASK: Ensure state in ["present", "absent"] 7557 1726882102.90031: ^ state is: HOST STATE: block=2, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7557 1726882102.90033: getting variables 7557 1726882102.90034: in VariableManager get_vars() 7557 1726882102.90055: Calling all_inventory to load vars for managed_node3 7557 1726882102.90058: Calling groups_inventory to load vars for managed_node3 7557 1726882102.90060: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882102.90066: Calling all_plugins_play to load vars for managed_node3 7557 1726882102.90069: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882102.90072: Calling groups_plugins_play to load vars for managed_node3 7557 1726882102.91217: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882102.92226: done with get_vars() 7557 1726882102.92242: done getting variables 7557 1726882102.92276: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Ensure state in ["present", "absent"]] *********************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:3 Friday 20 September 2024 21:28:22 -0400 (0:00:00.138) 0:00:28.775 ****** 7557 1726882102.92301: entering _queue_task() for managed_node3/fail 7557 1726882102.92553: worker is 1 (out of 1 available) 7557 1726882102.92565: exiting _queue_task() for managed_node3/fail 7557 1726882102.92579: done queuing things up, now waiting for results queue to drain 7557 1726882102.92580: waiting for pending results... 7557 1726882102.92759: running TaskExecutor() for managed_node3/TASK: Ensure state in ["present", "absent"] 7557 1726882102.92838: in run() - task 12673a56-9f93-ed48-b3a5-0000000010aa 7557 1726882102.92852: variable 'ansible_search_path' from source: unknown 7557 1726882102.92890: variable 'ansible_search_path' from source: unknown 7557 1726882102.92921: calling self._execute() 7557 1726882102.93029: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882102.93035: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882102.93045: variable 'omit' from source: magic vars 7557 1726882102.93434: variable 'ansible_distribution_major_version' from source: facts 7557 1726882102.93437: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882102.93550: variable 'state' from source: include params 7557 1726882102.93554: Evaluated conditional (state not in ["present", "absent"]): False 7557 1726882102.93557: when evaluation is False, skipping this task 7557 1726882102.93561: _execute() done 7557 1726882102.93564: dumping result to json 7557 1726882102.93567: done dumping result, returning 7557 1726882102.93699: done running TaskExecutor() for managed_node3/TASK: Ensure state in ["present", "absent"] [12673a56-9f93-ed48-b3a5-0000000010aa] 7557 1726882102.93702: sending task result for task 12673a56-9f93-ed48-b3a5-0000000010aa 7557 1726882102.93763: done sending task result for task 12673a56-9f93-ed48-b3a5-0000000010aa 7557 1726882102.93765: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "state not in [\"present\", \"absent\"]", "skip_reason": "Conditional result was False" } 7557 1726882102.93813: no more pending results, returning what we have 7557 
1726882102.93816: results queue empty 7557 1726882102.93817: checking for any_errors_fatal 7557 1726882102.93818: done checking for any_errors_fatal 7557 1726882102.93819: checking for max_fail_percentage 7557 1726882102.93820: done checking for max_fail_percentage 7557 1726882102.93821: checking to see if all hosts have failed and the running result is not ok 7557 1726882102.93822: done checking to see if all hosts have failed 7557 1726882102.93822: getting the remaining hosts for this loop 7557 1726882102.93824: done getting the remaining hosts for this loop 7557 1726882102.93827: getting the next task for host managed_node3 7557 1726882102.93831: done getting next task for host managed_node3 7557 1726882102.93833: ^ task is: TASK: Ensure type in ["dummy", "tap", "veth"] 7557 1726882102.93836: ^ state is: HOST STATE: block=2, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7557 1726882102.93840: getting variables 7557 1726882102.93841: in VariableManager get_vars() 7557 1726882102.93883: Calling all_inventory to load vars for managed_node3 7557 1726882102.93885: Calling groups_inventory to load vars for managed_node3 7557 1726882102.93888: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882102.93917: Calling all_plugins_play to load vars for managed_node3 7557 1726882102.93920: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882102.93924: Calling groups_plugins_play to load vars for managed_node3 7557 1726882102.95107: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882102.96130: done with get_vars() 7557 1726882102.96151: done getting variables 7557 1726882102.96200: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Ensure type in ["dummy", "tap", "veth"]] ********************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:8 Friday 20 September 2024 21:28:22 -0400 (0:00:00.039) 0:00:28.815 ****** 7557 1726882102.96222: entering _queue_task() for managed_node3/fail 7557 1726882102.96471: worker is 1 (out of 1 available) 7557 1726882102.96484: exiting _queue_task() for managed_node3/fail 7557 1726882102.96500: done queuing things up, now waiting for results queue to drain 7557 1726882102.96501: waiting for pending results... 
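The two guard tasks here, "Ensure state in [\"present\", \"absent\"]" (manage_test_interface.yml:3, skipped just above) and "Ensure type in [\"dummy\", \"tap\", \"veth\"]" (manage_test_interface.yml:8, queued next), validate the parameters passed into the include. The file itself is not reproduced in this log; judging from the task names, the fail action plugin, and the false_condition values recorded in the skip results, they are probably shaped roughly like the sketch below (the msg strings are placeholders, not taken from the log):

    # Hedged reconstruction of the guards at manage_test_interface.yml:3 and :8.
    # The when conditions are copied from the false_condition fields in the skip
    # results; the msg text is a hypothetical placeholder.
    - name: Ensure state in ["present", "absent"]
      fail:
        msg: "state must be 'present' or 'absent'"
      when: state not in ["present", "absent"]

    - name: Ensure type in ["dummy", "tap", "veth"]
      fail:
        msg: "type must be 'dummy', 'tap' or 'veth'"
      when: type not in ["dummy", "tap", "veth"]

Both conditions evaluate to False for this run (state comes from the include params, type from the play vars), so both guards are skipped and the flow continues to the show_interfaces.yml include.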
7557 1726882102.96680: running TaskExecutor() for managed_node3/TASK: Ensure type in ["dummy", "tap", "veth"] 7557 1726882102.96749: in run() - task 12673a56-9f93-ed48-b3a5-0000000010ab 7557 1726882102.96760: variable 'ansible_search_path' from source: unknown 7557 1726882102.96764: variable 'ansible_search_path' from source: unknown 7557 1726882102.96792: calling self._execute() 7557 1726882102.96886: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882102.96890: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882102.96905: variable 'omit' from source: magic vars 7557 1726882102.97302: variable 'ansible_distribution_major_version' from source: facts 7557 1726882102.97306: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882102.97517: variable 'type' from source: play vars 7557 1726882102.97520: Evaluated conditional (type not in ["dummy", "tap", "veth"]): False 7557 1726882102.97522: when evaluation is False, skipping this task 7557 1726882102.97524: _execute() done 7557 1726882102.97526: dumping result to json 7557 1726882102.97528: done dumping result, returning 7557 1726882102.97530: done running TaskExecutor() for managed_node3/TASK: Ensure type in ["dummy", "tap", "veth"] [12673a56-9f93-ed48-b3a5-0000000010ab] 7557 1726882102.97531: sending task result for task 12673a56-9f93-ed48-b3a5-0000000010ab 7557 1726882102.97589: done sending task result for task 12673a56-9f93-ed48-b3a5-0000000010ab 7557 1726882102.97591: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "type not in [\"dummy\", \"tap\", \"veth\"]", "skip_reason": "Conditional result was False" } 7557 1726882102.97661: no more pending results, returning what we have 7557 1726882102.97665: results queue empty 7557 1726882102.97666: checking for any_errors_fatal 7557 1726882102.97673: done checking for any_errors_fatal 7557 1726882102.97674: checking for max_fail_percentage 7557 1726882102.97675: done checking for max_fail_percentage 7557 1726882102.97676: checking to see if all hosts have failed and the running result is not ok 7557 1726882102.97677: done checking to see if all hosts have failed 7557 1726882102.97678: getting the remaining hosts for this loop 7557 1726882102.97679: done getting the remaining hosts for this loop 7557 1726882102.97682: getting the next task for host managed_node3 7557 1726882102.97687: done getting next task for host managed_node3 7557 1726882102.97689: ^ task is: TASK: Include the task 'show_interfaces.yml' 7557 1726882102.97692: ^ state is: HOST STATE: block=2, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7557 1726882102.97698: getting variables 7557 1726882102.97699: in VariableManager get_vars() 7557 1726882102.97804: Calling all_inventory to load vars for managed_node3 7557 1726882102.97807: Calling groups_inventory to load vars for managed_node3 7557 1726882102.97810: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882102.97819: Calling all_plugins_play to load vars for managed_node3 7557 1726882102.97822: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882102.97824: Calling groups_plugins_play to load vars for managed_node3 7557 1726882102.98977: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882102.99821: done with get_vars() 7557 1726882102.99837: done getting variables TASK [Include the task 'show_interfaces.yml'] ********************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:13 Friday 20 September 2024 21:28:22 -0400 (0:00:00.036) 0:00:28.851 ****** 7557 1726882102.99904: entering _queue_task() for managed_node3/include_tasks 7557 1726882103.00134: worker is 1 (out of 1 available) 7557 1726882103.00146: exiting _queue_task() for managed_node3/include_tasks 7557 1726882103.00159: done queuing things up, now waiting for results queue to drain 7557 1726882103.00161: waiting for pending results... 7557 1726882103.00336: running TaskExecutor() for managed_node3/TASK: Include the task 'show_interfaces.yml' 7557 1726882103.00407: in run() - task 12673a56-9f93-ed48-b3a5-0000000010ac 7557 1726882103.00419: variable 'ansible_search_path' from source: unknown 7557 1726882103.00424: variable 'ansible_search_path' from source: unknown 7557 1726882103.00453: calling self._execute() 7557 1726882103.00534: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882103.00539: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882103.00548: variable 'omit' from source: magic vars 7557 1726882103.00830: variable 'ansible_distribution_major_version' from source: facts 7557 1726882103.00839: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882103.00846: _execute() done 7557 1726882103.00849: dumping result to json 7557 1726882103.00851: done dumping result, returning 7557 1726882103.00858: done running TaskExecutor() for managed_node3/TASK: Include the task 'show_interfaces.yml' [12673a56-9f93-ed48-b3a5-0000000010ac] 7557 1726882103.00862: sending task result for task 12673a56-9f93-ed48-b3a5-0000000010ac 7557 1726882103.00948: done sending task result for task 12673a56-9f93-ed48-b3a5-0000000010ac 7557 1726882103.00951: WORKER PROCESS EXITING 7557 1726882103.00977: no more pending results, returning what we have 7557 1726882103.00982: in VariableManager get_vars() 7557 1726882103.01039: Calling all_inventory to load vars for managed_node3 7557 1726882103.01041: Calling groups_inventory to load vars for managed_node3 7557 1726882103.01044: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882103.01055: Calling all_plugins_play to load vars for managed_node3 7557 1726882103.01058: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882103.01060: Calling groups_plugins_play to load vars for managed_node3 7557 1726882103.01820: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882103.02673: 
done with get_vars() 7557 1726882103.02688: variable 'ansible_search_path' from source: unknown 7557 1726882103.02689: variable 'ansible_search_path' from source: unknown 7557 1726882103.02716: we have included files to process 7557 1726882103.02717: generating all_blocks data 7557 1726882103.02718: done generating all_blocks data 7557 1726882103.02721: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 7557 1726882103.02722: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 7557 1726882103.02723: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 7557 1726882103.02792: in VariableManager get_vars() 7557 1726882103.02814: done with get_vars() 7557 1726882103.02887: done processing included file 7557 1726882103.02889: iterating over new_blocks loaded from include file 7557 1726882103.02890: in VariableManager get_vars() 7557 1726882103.02909: done with get_vars() 7557 1726882103.02910: filtering new block on tags 7557 1726882103.02921: done filtering new block on tags 7557 1726882103.02923: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml for managed_node3 7557 1726882103.02926: extending task lists for all hosts with included blocks 7557 1726882103.03147: done extending task lists 7557 1726882103.03148: done processing included files 7557 1726882103.03149: results queue empty 7557 1726882103.03149: checking for any_errors_fatal 7557 1726882103.03151: done checking for any_errors_fatal 7557 1726882103.03151: checking for max_fail_percentage 7557 1726882103.03152: done checking for max_fail_percentage 7557 1726882103.03152: checking to see if all hosts have failed and the running result is not ok 7557 1726882103.03153: done checking to see if all hosts have failed 7557 1726882103.03153: getting the remaining hosts for this loop 7557 1726882103.03154: done getting the remaining hosts for this loop 7557 1726882103.03156: getting the next task for host managed_node3 7557 1726882103.03158: done getting next task for host managed_node3 7557 1726882103.03160: ^ task is: TASK: Include the task 'get_current_interfaces.yml' 7557 1726882103.03162: ^ state is: HOST STATE: block=2, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7557 1726882103.03164: getting variables 7557 1726882103.03164: in VariableManager get_vars() 7557 1726882103.03175: Calling all_inventory to load vars for managed_node3 7557 1726882103.03177: Calling groups_inventory to load vars for managed_node3 7557 1726882103.03178: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882103.03182: Calling all_plugins_play to load vars for managed_node3 7557 1726882103.03183: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882103.03185: Calling groups_plugins_play to load vars for managed_node3 7557 1726882103.03870: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882103.04707: done with get_vars() 7557 1726882103.04722: done getting variables TASK [Include the task 'get_current_interfaces.yml'] *************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:3 Friday 20 September 2024 21:28:23 -0400 (0:00:00.048) 0:00:28.900 ****** 7557 1726882103.04776: entering _queue_task() for managed_node3/include_tasks 7557 1726882103.05019: worker is 1 (out of 1 available) 7557 1726882103.05033: exiting _queue_task() for managed_node3/include_tasks 7557 1726882103.05047: done queuing things up, now waiting for results queue to drain 7557 1726882103.05049: waiting for pending results... 7557 1726882103.05231: running TaskExecutor() for managed_node3/TASK: Include the task 'get_current_interfaces.yml' 7557 1726882103.05308: in run() - task 12673a56-9f93-ed48-b3a5-00000000130a 7557 1726882103.05319: variable 'ansible_search_path' from source: unknown 7557 1726882103.05322: variable 'ansible_search_path' from source: unknown 7557 1726882103.05351: calling self._execute() 7557 1726882103.05431: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882103.05435: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882103.05445: variable 'omit' from source: magic vars 7557 1726882103.05720: variable 'ansible_distribution_major_version' from source: facts 7557 1726882103.05728: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882103.05734: _execute() done 7557 1726882103.05738: dumping result to json 7557 1726882103.05740: done dumping result, returning 7557 1726882103.05747: done running TaskExecutor() for managed_node3/TASK: Include the task 'get_current_interfaces.yml' [12673a56-9f93-ed48-b3a5-00000000130a] 7557 1726882103.05751: sending task result for task 12673a56-9f93-ed48-b3a5-00000000130a 7557 1726882103.05836: done sending task result for task 12673a56-9f93-ed48-b3a5-00000000130a 7557 1726882103.05838: WORKER PROCESS EXITING 7557 1726882103.05863: no more pending results, returning what we have 7557 1726882103.05868: in VariableManager get_vars() 7557 1726882103.05924: Calling all_inventory to load vars for managed_node3 7557 1726882103.05927: Calling groups_inventory to load vars for managed_node3 7557 1726882103.05929: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882103.05940: Calling all_plugins_play to load vars for managed_node3 7557 1726882103.05943: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882103.05946: Calling groups_plugins_play to load vars for managed_node3 7557 1726882103.06715: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 
1726882103.07569: done with get_vars() 7557 1726882103.07582: variable 'ansible_search_path' from source: unknown 7557 1726882103.07583: variable 'ansible_search_path' from source: unknown 7557 1726882103.07625: we have included files to process 7557 1726882103.07626: generating all_blocks data 7557 1726882103.07628: done generating all_blocks data 7557 1726882103.07628: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 7557 1726882103.07629: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 7557 1726882103.07630: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 7557 1726882103.07804: done processing included file 7557 1726882103.07806: iterating over new_blocks loaded from include file 7557 1726882103.07807: in VariableManager get_vars() 7557 1726882103.07823: done with get_vars() 7557 1726882103.07824: filtering new block on tags 7557 1726882103.07836: done filtering new block on tags 7557 1726882103.07838: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml for managed_node3 7557 1726882103.07843: extending task lists for all hosts with included blocks 7557 1726882103.07930: done extending task lists 7557 1726882103.07931: done processing included files 7557 1726882103.07932: results queue empty 7557 1726882103.07932: checking for any_errors_fatal 7557 1726882103.07935: done checking for any_errors_fatal 7557 1726882103.07936: checking for max_fail_percentage 7557 1726882103.07936: done checking for max_fail_percentage 7557 1726882103.07937: checking to see if all hosts have failed and the running result is not ok 7557 1726882103.07937: done checking to see if all hosts have failed 7557 1726882103.07938: getting the remaining hosts for this loop 7557 1726882103.07938: done getting the remaining hosts for this loop 7557 1726882103.07940: getting the next task for host managed_node3 7557 1726882103.07943: done getting next task for host managed_node3 7557 1726882103.07944: ^ task is: TASK: Gather current interface info 7557 1726882103.07947: ^ state is: HOST STATE: block=2, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7557 1726882103.07949: getting variables 7557 1726882103.07950: in VariableManager get_vars() 7557 1726882103.07962: Calling all_inventory to load vars for managed_node3 7557 1726882103.07963: Calling groups_inventory to load vars for managed_node3 7557 1726882103.07965: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882103.07968: Calling all_plugins_play to load vars for managed_node3 7557 1726882103.07970: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882103.07971: Calling groups_plugins_play to load vars for managed_node3 7557 1726882103.08687: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882103.09529: done with get_vars() 7557 1726882103.09544: done getting variables 7557 1726882103.09574: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gather current interface info] ******************************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3 Friday 20 September 2024 21:28:23 -0400 (0:00:00.048) 0:00:28.948 ****** 7557 1726882103.09602: entering _queue_task() for managed_node3/command 7557 1726882103.09855: worker is 1 (out of 1 available) 7557 1726882103.09869: exiting _queue_task() for managed_node3/command 7557 1726882103.09882: done queuing things up, now waiting for results queue to drain 7557 1726882103.09883: waiting for pending results... 7557 1726882103.10065: running TaskExecutor() for managed_node3/TASK: Gather current interface info 7557 1726882103.10142: in run() - task 12673a56-9f93-ed48-b3a5-000000001341 7557 1726882103.10153: variable 'ansible_search_path' from source: unknown 7557 1726882103.10157: variable 'ansible_search_path' from source: unknown 7557 1726882103.10185: calling self._execute() 7557 1726882103.10266: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882103.10270: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882103.10278: variable 'omit' from source: magic vars 7557 1726882103.10561: variable 'ansible_distribution_major_version' from source: facts 7557 1726882103.10569: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882103.10574: variable 'omit' from source: magic vars 7557 1726882103.10611: variable 'omit' from source: magic vars 7557 1726882103.10636: variable 'omit' from source: magic vars 7557 1726882103.10670: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7557 1726882103.10700: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7557 1726882103.10714: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7557 1726882103.10727: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7557 1726882103.10738: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7557 1726882103.10765: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7557 
1726882103.10769: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882103.10772: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882103.10843: Set connection var ansible_module_compression to ZIP_DEFLATED 7557 1726882103.10848: Set connection var ansible_shell_executable to /bin/sh 7557 1726882103.10851: Set connection var ansible_shell_type to sh 7557 1726882103.10856: Set connection var ansible_pipelining to False 7557 1726882103.10859: Set connection var ansible_connection to ssh 7557 1726882103.10864: Set connection var ansible_timeout to 10 7557 1726882103.10883: variable 'ansible_shell_executable' from source: unknown 7557 1726882103.10886: variable 'ansible_connection' from source: unknown 7557 1726882103.10889: variable 'ansible_module_compression' from source: unknown 7557 1726882103.10891: variable 'ansible_shell_type' from source: unknown 7557 1726882103.10898: variable 'ansible_shell_executable' from source: unknown 7557 1726882103.10900: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882103.10903: variable 'ansible_pipelining' from source: unknown 7557 1726882103.10905: variable 'ansible_timeout' from source: unknown 7557 1726882103.10907: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882103.11007: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 7557 1726882103.11019: variable 'omit' from source: magic vars 7557 1726882103.11022: starting attempt loop 7557 1726882103.11024: running the handler 7557 1726882103.11040: _low_level_execute_command(): starting 7557 1726882103.11047: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7557 1726882103.11567: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882103.11572: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882103.11575: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882103.11578: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882103.11637: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882103.11640: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882103.11648: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882103.11706: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 
1726882103.13360: stdout chunk (state=3): >>>/root <<< 7557 1726882103.13454: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882103.13484: stderr chunk (state=3): >>><<< 7557 1726882103.13488: stdout chunk (state=3): >>><<< 7557 1726882103.13517: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7557 1726882103.13530: _low_level_execute_command(): starting 7557 1726882103.13536: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882103.1351714-8702-186295646072761 `" && echo ansible-tmp-1726882103.1351714-8702-186295646072761="` echo /root/.ansible/tmp/ansible-tmp-1726882103.1351714-8702-186295646072761 `" ) && sleep 0' 7557 1726882103.13962: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7557 1726882103.13969: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7557 1726882103.13999: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882103.14011: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 7557 1726882103.14015: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found <<< 7557 1726882103.14024: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882103.14060: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882103.14063: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882103.14117: stderr chunk (state=3): >>>debug1: mux_client_request_session: master 
session id: 2 <<< 7557 1726882103.15982: stdout chunk (state=3): >>>ansible-tmp-1726882103.1351714-8702-186295646072761=/root/.ansible/tmp/ansible-tmp-1726882103.1351714-8702-186295646072761 <<< 7557 1726882103.16084: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882103.16119: stderr chunk (state=3): >>><<< 7557 1726882103.16122: stdout chunk (state=3): >>><<< 7557 1726882103.16138: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882103.1351714-8702-186295646072761=/root/.ansible/tmp/ansible-tmp-1726882103.1351714-8702-186295646072761 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7557 1726882103.16166: variable 'ansible_module_compression' from source: unknown 7557 1726882103.16212: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-7557ap94rh2e/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 7557 1726882103.16242: variable 'ansible_facts' from source: unknown 7557 1726882103.16302: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882103.1351714-8702-186295646072761/AnsiballZ_command.py 7557 1726882103.16408: Sending initial data 7557 1726882103.16411: Sent initial data (154 bytes) 7557 1726882103.16875: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7557 1726882103.16879: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found <<< 7557 1726882103.16882: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882103.16885: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7557 1726882103.16887: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882103.16934: stderr chunk (state=3): 
>>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882103.16937: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882103.16990: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882103.18509: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 7557 1726882103.18555: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 7557 1726882103.18606: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-7557ap94rh2e/tmppwt7cd6m /root/.ansible/tmp/ansible-tmp-1726882103.1351714-8702-186295646072761/AnsiballZ_command.py <<< 7557 1726882103.18612: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882103.1351714-8702-186295646072761/AnsiballZ_command.py" <<< 7557 1726882103.18652: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-7557ap94rh2e/tmppwt7cd6m" to remote "/root/.ansible/tmp/ansible-tmp-1726882103.1351714-8702-186295646072761/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882103.1351714-8702-186295646072761/AnsiballZ_command.py" <<< 7557 1726882103.19186: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882103.19237: stderr chunk (state=3): >>><<< 7557 1726882103.19240: stdout chunk (state=3): >>><<< 7557 1726882103.19279: done transferring module to remote 7557 1726882103.19290: _low_level_execute_command(): starting 7557 1726882103.19296: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882103.1351714-8702-186295646072761/ /root/.ansible/tmp/ansible-tmp-1726882103.1351714-8702-186295646072761/AnsiballZ_command.py && sleep 0' 7557 1726882103.19751: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7557 1726882103.19754: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found <<< 7557 1726882103.19756: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882103.19759: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 7557 1726882103.19764: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882103.19815: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882103.19820: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882103.19864: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882103.21560: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882103.21586: stderr chunk (state=3): >>><<< 7557 1726882103.21590: stdout chunk (state=3): >>><<< 7557 1726882103.21607: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7557 1726882103.21610: _low_level_execute_command(): starting 7557 1726882103.21613: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882103.1351714-8702-186295646072761/AnsiballZ_command.py && sleep 0' 7557 1726882103.22066: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7557 1726882103.22071: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found <<< 7557 1726882103.22073: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882103.22075: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882103.22077: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 
7557 1726882103.22126: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882103.22130: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882103.22183: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882103.37392: stdout chunk (state=3): >>> {"changed": true, "stdout": "eth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 21:28:23.368941", "end": "2024-09-20 21:28:23.372184", "delta": "0:00:00.003243", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 7557 1726882103.39085: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. <<< 7557 1726882103.39089: stdout chunk (state=3): >>><<< 7557 1726882103.39091: stderr chunk (state=3): >>><<< 7557 1726882103.39135: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "eth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 21:28:23.368941", "end": "2024-09-20 21:28:23.372184", "delta": "0:00:00.003243", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. 
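The module result above shows exactly what the "Gather current interface info" task ran on the target: ls -1 with chdir set to /sys/class/net, returning eth0 and lo. Combined with the registered variable _current_interfaces and the current_interfaces fact set a few steps further down, the two tasks in get_current_interfaces.yml (lines 3 and 9) are most likely close to the following sketch; it is reconstructed from this log, not copied from the file:

    # Hedged reconstruction of get_current_interfaces.yml:3 and :9, inferred from
    # the module_args (chdir=/sys/class/net, 'ls -1'), the _current_interfaces
    # register name, and the resulting current_interfaces fact ["eth0", "lo"].
    - name: Gather current interface info
      command: ls -1
      args:
        chdir: /sys/class/net
      register: _current_interfaces
      changed_when: false   # likely, given the later "Evaluated conditional (False)" line and the changed: false result

    - name: Set current_interfaces
      set_fact:
        current_interfaces: "{{ _current_interfaces.stdout_lines }}"

Listing /sys/class/net yields one interface name per line, so stdout_lines maps directly onto the current_interfaces list without any extra parsing.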
7557 1726882103.39141: done with _execute_module (ansible.legacy.command, {'chdir': '/sys/class/net', '_raw_params': 'ls -1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882103.1351714-8702-186295646072761/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7557 1726882103.39144: _low_level_execute_command(): starting 7557 1726882103.39146: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882103.1351714-8702-186295646072761/ > /dev/null 2>&1 && sleep 0' 7557 1726882103.40581: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882103.40590: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882103.40715: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882103.42526: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882103.42530: stdout chunk (state=3): >>><<< 7557 1726882103.42532: stderr chunk (state=3): >>><<< 7557 1726882103.42555: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7557 1726882103.42567: handler run complete 7557 1726882103.42600: Evaluated conditional (False): False 7557 1726882103.42617: attempt loop complete, returning result 7557 1726882103.42801: _execute() done 7557 1726882103.42806: dumping result to json 7557 1726882103.42809: done dumping result, returning 7557 1726882103.42812: done running TaskExecutor() for managed_node3/TASK: Gather current interface info [12673a56-9f93-ed48-b3a5-000000001341] 7557 1726882103.42813: sending task result for task 12673a56-9f93-ed48-b3a5-000000001341 7557 1726882103.42887: done sending task result for task 12673a56-9f93-ed48-b3a5-000000001341 7557 1726882103.42889: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "cmd": [ "ls", "-1" ], "delta": "0:00:00.003243", "end": "2024-09-20 21:28:23.372184", "rc": 0, "start": "2024-09-20 21:28:23.368941" } STDOUT: eth0 lo 7557 1726882103.42990: no more pending results, returning what we have 7557 1726882103.42996: results queue empty 7557 1726882103.42998: checking for any_errors_fatal 7557 1726882103.42999: done checking for any_errors_fatal 7557 1726882103.43000: checking for max_fail_percentage 7557 1726882103.43002: done checking for max_fail_percentage 7557 1726882103.43002: checking to see if all hosts have failed and the running result is not ok 7557 1726882103.43003: done checking to see if all hosts have failed 7557 1726882103.43004: getting the remaining hosts for this loop 7557 1726882103.43005: done getting the remaining hosts for this loop 7557 1726882103.43008: getting the next task for host managed_node3 7557 1726882103.43014: done getting next task for host managed_node3 7557 1726882103.43016: ^ task is: TASK: Set current_interfaces 7557 1726882103.43022: ^ state is: HOST STATE: block=2, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7557 1726882103.43026: getting variables 7557 1726882103.43027: in VariableManager get_vars() 7557 1726882103.43069: Calling all_inventory to load vars for managed_node3 7557 1726882103.43072: Calling groups_inventory to load vars for managed_node3 7557 1726882103.43074: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882103.43083: Calling all_plugins_play to load vars for managed_node3 7557 1726882103.43085: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882103.43088: Calling groups_plugins_play to load vars for managed_node3 7557 1726882103.45537: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882103.47633: done with get_vars() 7557 1726882103.47664: done getting variables 7557 1726882103.47727: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set current_interfaces] ************************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:9 Friday 20 September 2024 21:28:23 -0400 (0:00:00.381) 0:00:29.330 ****** 7557 1726882103.47767: entering _queue_task() for managed_node3/set_fact 7557 1726882103.48124: worker is 1 (out of 1 available) 7557 1726882103.48136: exiting _queue_task() for managed_node3/set_fact 7557 1726882103.48151: done queuing things up, now waiting for results queue to drain 7557 1726882103.48152: waiting for pending results... 7557 1726882103.48401: running TaskExecutor() for managed_node3/TASK: Set current_interfaces 7557 1726882103.48530: in run() - task 12673a56-9f93-ed48-b3a5-000000001342 7557 1726882103.48550: variable 'ansible_search_path' from source: unknown 7557 1726882103.48557: variable 'ansible_search_path' from source: unknown 7557 1726882103.48597: calling self._execute() 7557 1726882103.48702: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882103.48714: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882103.48729: variable 'omit' from source: magic vars 7557 1726882103.49109: variable 'ansible_distribution_major_version' from source: facts 7557 1726882103.49127: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882103.49138: variable 'omit' from source: magic vars 7557 1726882103.49269: variable 'omit' from source: magic vars 7557 1726882103.49416: variable '_current_interfaces' from source: set_fact 7557 1726882103.49703: variable 'omit' from source: magic vars 7557 1726882103.49707: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7557 1726882103.49713: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7557 1726882103.49716: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7557 1726882103.49827: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7557 1726882103.49847: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7557 1726882103.49883: variable 
'inventory_hostname' from source: host vars for 'managed_node3' 7557 1726882103.49905: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882103.49931: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882103.50300: Set connection var ansible_module_compression to ZIP_DEFLATED 7557 1726882103.50303: Set connection var ansible_shell_executable to /bin/sh 7557 1726882103.50305: Set connection var ansible_shell_type to sh 7557 1726882103.50307: Set connection var ansible_pipelining to False 7557 1726882103.50309: Set connection var ansible_connection to ssh 7557 1726882103.50312: Set connection var ansible_timeout to 10 7557 1726882103.50314: variable 'ansible_shell_executable' from source: unknown 7557 1726882103.50317: variable 'ansible_connection' from source: unknown 7557 1726882103.50319: variable 'ansible_module_compression' from source: unknown 7557 1726882103.50320: variable 'ansible_shell_type' from source: unknown 7557 1726882103.50322: variable 'ansible_shell_executable' from source: unknown 7557 1726882103.50324: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882103.50326: variable 'ansible_pipelining' from source: unknown 7557 1726882103.50327: variable 'ansible_timeout' from source: unknown 7557 1726882103.50329: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882103.50701: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 7557 1726882103.50770: variable 'omit' from source: magic vars 7557 1726882103.50779: starting attempt loop 7557 1726882103.50787: running the handler 7557 1726882103.50888: handler run complete 7557 1726882103.50892: attempt loop complete, returning result 7557 1726882103.50899: _execute() done 7557 1726882103.50901: dumping result to json 7557 1726882103.50903: done dumping result, returning 7557 1726882103.50906: done running TaskExecutor() for managed_node3/TASK: Set current_interfaces [12673a56-9f93-ed48-b3a5-000000001342] 7557 1726882103.50908: sending task result for task 12673a56-9f93-ed48-b3a5-000000001342 7557 1726882103.50974: done sending task result for task 12673a56-9f93-ed48-b3a5-000000001342 7557 1726882103.50978: WORKER PROCESS EXITING ok: [managed_node3] => { "ansible_facts": { "current_interfaces": [ "eth0", "lo" ] }, "changed": false } 7557 1726882103.51051: no more pending results, returning what we have 7557 1726882103.51055: results queue empty 7557 1726882103.51056: checking for any_errors_fatal 7557 1726882103.51204: done checking for any_errors_fatal 7557 1726882103.51206: checking for max_fail_percentage 7557 1726882103.51208: done checking for max_fail_percentage 7557 1726882103.51209: checking to see if all hosts have failed and the running result is not ok 7557 1726882103.51210: done checking to see if all hosts have failed 7557 1726882103.51210: getting the remaining hosts for this loop 7557 1726882103.51212: done getting the remaining hosts for this loop 7557 1726882103.51215: getting the next task for host managed_node3 7557 1726882103.51224: done getting next task for host managed_node3 7557 1726882103.51227: ^ task is: TASK: Show current_interfaces 7557 1726882103.51231: ^ state is: HOST STATE: block=2, task=25, rescue=0, always=0, handlers=0, 
run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7557 1726882103.51235: getting variables 7557 1726882103.51236: in VariableManager get_vars() 7557 1726882103.51356: Calling all_inventory to load vars for managed_node3 7557 1726882103.51359: Calling groups_inventory to load vars for managed_node3 7557 1726882103.51362: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882103.51371: Calling all_plugins_play to load vars for managed_node3 7557 1726882103.51373: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882103.51375: Calling groups_plugins_play to load vars for managed_node3 7557 1726882103.53097: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882103.54649: done with get_vars() 7557 1726882103.54676: done getting variables 7557 1726882103.54736: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Show current_interfaces] ************************************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:5 Friday 20 September 2024 21:28:23 -0400 (0:00:00.070) 0:00:29.400 ****** 7557 1726882103.54768: entering _queue_task() for managed_node3/debug 7557 1726882103.55101: worker is 1 (out of 1 available) 7557 1726882103.55115: exiting _queue_task() for managed_node3/debug 7557 1726882103.55128: done queuing things up, now waiting for results queue to drain 7557 1726882103.55130: waiting for pending results... 
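The two results above show how this test harness discovers interfaces: a raw `ls -1` run with chdir /sys/class/net, followed by a set_fact that stores the listing as current_interfaces (['eth0', 'lo'] here). A minimal reconstruction of those tasks, inferred only from the module arguments and results echoed in this log (the actual get_current_interfaces.yml in the fedora.linux_system_roles collection may differ), would be:

- name: Gather current interface info
  ansible.builtin.command:
    cmd: ls -1                 # logged as _raw_params: 'ls -1'
    chdir: /sys/class/net      # logged as chdir: '/sys/class/net'
  register: _current_interfaces   # variable name taken from the log; how it is populated there may differ
  changed_when: false          # the logged result reports changed: false

- name: Set current_interfaces
  ansible.builtin.set_fact:
    current_interfaces: "{{ _current_interfaces.stdout_lines }}"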
7557 1726882103.55517: running TaskExecutor() for managed_node3/TASK: Show current_interfaces 7557 1726882103.55523: in run() - task 12673a56-9f93-ed48-b3a5-00000000130b 7557 1726882103.55527: variable 'ansible_search_path' from source: unknown 7557 1726882103.55530: variable 'ansible_search_path' from source: unknown 7557 1726882103.55569: calling self._execute() 7557 1726882103.55675: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882103.55756: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882103.55759: variable 'omit' from source: magic vars 7557 1726882103.56057: variable 'ansible_distribution_major_version' from source: facts 7557 1726882103.56070: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882103.56077: variable 'omit' from source: magic vars 7557 1726882103.56152: variable 'omit' from source: magic vars 7557 1726882103.56304: variable 'current_interfaces' from source: set_fact 7557 1726882103.56308: variable 'omit' from source: magic vars 7557 1726882103.56354: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7557 1726882103.56400: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7557 1726882103.56434: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7557 1726882103.56456: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7557 1726882103.56474: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7557 1726882103.56517: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7557 1726882103.56529: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882103.56538: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882103.56658: Set connection var ansible_module_compression to ZIP_DEFLATED 7557 1726882103.56671: Set connection var ansible_shell_executable to /bin/sh 7557 1726882103.56678: Set connection var ansible_shell_type to sh 7557 1726882103.56737: Set connection var ansible_pipelining to False 7557 1726882103.56740: Set connection var ansible_connection to ssh 7557 1726882103.56742: Set connection var ansible_timeout to 10 7557 1726882103.56745: variable 'ansible_shell_executable' from source: unknown 7557 1726882103.56747: variable 'ansible_connection' from source: unknown 7557 1726882103.56749: variable 'ansible_module_compression' from source: unknown 7557 1726882103.56752: variable 'ansible_shell_type' from source: unknown 7557 1726882103.56754: variable 'ansible_shell_executable' from source: unknown 7557 1726882103.56761: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882103.56769: variable 'ansible_pipelining' from source: unknown 7557 1726882103.56776: variable 'ansible_timeout' from source: unknown 7557 1726882103.56783: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882103.56942: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 7557 1726882103.57099: variable 'omit' 
from source: magic vars 7557 1726882103.57103: starting attempt loop 7557 1726882103.57105: running the handler 7557 1726882103.57108: handler run complete 7557 1726882103.57110: attempt loop complete, returning result 7557 1726882103.57112: _execute() done 7557 1726882103.57115: dumping result to json 7557 1726882103.57116: done dumping result, returning 7557 1726882103.57119: done running TaskExecutor() for managed_node3/TASK: Show current_interfaces [12673a56-9f93-ed48-b3a5-00000000130b] 7557 1726882103.57121: sending task result for task 12673a56-9f93-ed48-b3a5-00000000130b 7557 1726882103.57188: done sending task result for task 12673a56-9f93-ed48-b3a5-00000000130b 7557 1726882103.57191: WORKER PROCESS EXITING ok: [managed_node3] => {} MSG: current_interfaces: ['eth0', 'lo'] 7557 1726882103.57247: no more pending results, returning what we have 7557 1726882103.57252: results queue empty 7557 1726882103.57253: checking for any_errors_fatal 7557 1726882103.57260: done checking for any_errors_fatal 7557 1726882103.57261: checking for max_fail_percentage 7557 1726882103.57263: done checking for max_fail_percentage 7557 1726882103.57264: checking to see if all hosts have failed and the running result is not ok 7557 1726882103.57265: done checking to see if all hosts have failed 7557 1726882103.57266: getting the remaining hosts for this loop 7557 1726882103.57267: done getting the remaining hosts for this loop 7557 1726882103.57272: getting the next task for host managed_node3 7557 1726882103.57281: done getting next task for host managed_node3 7557 1726882103.57284: ^ task is: TASK: Install iproute 7557 1726882103.57288: ^ state is: HOST STATE: block=2, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7557 1726882103.57296: getting variables 7557 1726882103.57298: in VariableManager get_vars() 7557 1726882103.57359: Calling all_inventory to load vars for managed_node3 7557 1726882103.57362: Calling groups_inventory to load vars for managed_node3 7557 1726882103.57365: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882103.57376: Calling all_plugins_play to load vars for managed_node3 7557 1726882103.57379: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882103.57382: Calling groups_plugins_play to load vars for managed_node3 7557 1726882103.59232: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882103.60780: done with get_vars() 7557 1726882103.60809: done getting variables 7557 1726882103.60877: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Install iproute] ********************************************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:16 Friday 20 September 2024 21:28:23 -0400 (0:00:00.061) 0:00:29.462 ****** 7557 1726882103.60913: entering _queue_task() for managed_node3/package 7557 1726882103.61271: worker is 1 (out of 1 available) 7557 1726882103.61284: exiting _queue_task() for managed_node3/package 7557 1726882103.61309: done queuing things up, now waiting for results queue to drain 7557 1726882103.61310: waiting for pending results... 7557 1726882103.61652: running TaskExecutor() for managed_node3/TASK: Install iproute 7557 1726882103.61671: in run() - task 12673a56-9f93-ed48-b3a5-0000000010ad 7557 1726882103.61719: variable 'ansible_search_path' from source: unknown 7557 1726882103.61724: variable 'ansible_search_path' from source: unknown 7557 1726882103.61751: calling self._execute() 7557 1726882103.61866: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882103.61877: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882103.62162: variable 'omit' from source: magic vars 7557 1726882103.62507: variable 'ansible_distribution_major_version' from source: facts 7557 1726882103.62523: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882103.62533: variable 'omit' from source: magic vars 7557 1726882103.62572: variable 'omit' from source: magic vars 7557 1726882103.62774: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 7557 1726882103.64920: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 7557 1726882103.64996: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 7557 1726882103.65039: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 7557 1726882103.65078: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 7557 1726882103.65119: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 7557 1726882103.65222: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7557 1726882103.65267: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7557 1726882103.65302: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7557 1726882103.65351: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7557 1726882103.65371: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7557 1726882103.65477: variable '__network_is_ostree' from source: set_fact 7557 1726882103.65526: variable 'omit' from source: magic vars 7557 1726882103.65529: variable 'omit' from source: magic vars 7557 1726882103.65554: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7557 1726882103.65587: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7557 1726882103.65615: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7557 1726882103.65643: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7557 1726882103.65659: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7557 1726882103.65698: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7557 1726882103.65745: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882103.65748: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882103.65838: Set connection var ansible_module_compression to ZIP_DEFLATED 7557 1726882103.65857: Set connection var ansible_shell_executable to /bin/sh 7557 1726882103.65865: Set connection var ansible_shell_type to sh 7557 1726882103.65876: Set connection var ansible_pipelining to False 7557 1726882103.65899: Set connection var ansible_connection to ssh 7557 1726882103.65903: Set connection var ansible_timeout to 10 7557 1726882103.65926: variable 'ansible_shell_executable' from source: unknown 7557 1726882103.65961: variable 'ansible_connection' from source: unknown 7557 1726882103.65964: variable 'ansible_module_compression' from source: unknown 7557 1726882103.65967: variable 'ansible_shell_type' from source: unknown 7557 1726882103.65969: variable 'ansible_shell_executable' from source: unknown 7557 1726882103.65972: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882103.65974: variable 'ansible_pipelining' from source: unknown 7557 1726882103.65976: variable 'ansible_timeout' from source: unknown 7557 1726882103.65978: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882103.66101: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 7557 1726882103.66104: variable 'omit' from source: magic vars 7557 1726882103.66114: starting attempt loop 7557 1726882103.66122: running the handler 7557 1726882103.66178: variable 'ansible_facts' from source: unknown 7557 1726882103.66181: variable 'ansible_facts' from source: unknown 7557 1726882103.66184: _low_level_execute_command(): starting 7557 1726882103.66192: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7557 1726882103.66924: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7557 1726882103.66940: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7557 1726882103.66955: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882103.67066: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882103.67086: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882103.67174: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882103.68845: stdout chunk (state=3): >>>/root <<< 7557 1726882103.68978: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882103.68995: stdout chunk (state=3): >>><<< 7557 1726882103.69029: stderr chunk (state=3): >>><<< 7557 1726882103.69063: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 
2 debug2: Received exit status from master 0 7557 1726882103.69092: _low_level_execute_command(): starting 7557 1726882103.69128: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882103.6907852-8724-215335117832924 `" && echo ansible-tmp-1726882103.6907852-8724-215335117832924="` echo /root/.ansible/tmp/ansible-tmp-1726882103.6907852-8724-215335117832924 `" ) && sleep 0' 7557 1726882103.69970: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7557 1726882103.69973: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7557 1726882103.70053: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7557 1726882103.70082: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882103.70219: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882103.70230: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882103.70255: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882103.70343: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882103.72204: stdout chunk (state=3): >>>ansible-tmp-1726882103.6907852-8724-215335117832924=/root/.ansible/tmp/ansible-tmp-1726882103.6907852-8724-215335117832924 <<< 7557 1726882103.72361: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882103.72364: stdout chunk (state=3): >>><<< 7557 1726882103.72368: stderr chunk (state=3): >>><<< 7557 1726882103.72390: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882103.6907852-8724-215335117832924=/root/.ansible/tmp/ansible-tmp-1726882103.6907852-8724-215335117832924 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7557 1726882103.72601: variable 'ansible_module_compression' from source: unknown 7557 1726882103.72605: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-7557ap94rh2e/ansiballz_cache/ansible.modules.dnf-ZIP_DEFLATED 7557 1726882103.72608: variable 'ansible_facts' from source: unknown 7557 1726882103.72685: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882103.6907852-8724-215335117832924/AnsiballZ_dnf.py 7557 1726882103.72967: Sending initial data 7557 1726882103.72971: Sent initial data (150 bytes) 7557 1726882103.73645: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7557 1726882103.73662: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7557 1726882103.73677: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882103.73812: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7557 1726882103.73816: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882103.73845: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882103.73864: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882103.73949: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882103.75478: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 7557 1726882103.75533: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 7557 1726882103.75575: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-7557ap94rh2e/tmp7315d714 /root/.ansible/tmp/ansible-tmp-1726882103.6907852-8724-215335117832924/AnsiballZ_dnf.py <<< 7557 1726882103.75600: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882103.6907852-8724-215335117832924/AnsiballZ_dnf.py" <<< 7557 1726882103.75690: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-7557ap94rh2e/tmp7315d714" to remote "/root/.ansible/tmp/ansible-tmp-1726882103.6907852-8724-215335117832924/AnsiballZ_dnf.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882103.6907852-8724-215335117832924/AnsiballZ_dnf.py" <<< 7557 1726882103.76907: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882103.76910: stdout chunk (state=3): >>><<< 7557 1726882103.76913: stderr chunk (state=3): >>><<< 7557 1726882103.77000: done transferring module to remote 7557 1726882103.77005: _low_level_execute_command(): starting 7557 1726882103.77008: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882103.6907852-8724-215335117832924/ /root/.ansible/tmp/ansible-tmp-1726882103.6907852-8724-215335117832924/AnsiballZ_dnf.py && sleep 0' 7557 1726882103.77635: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7557 1726882103.77651: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7557 1726882103.77686: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882103.77708: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7557 1726882103.77805: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882103.77832: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882103.77835: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882103.77863: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882103.77918: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882103.79646: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882103.79698: stderr chunk (state=3): >>><<< 7557 1726882103.79702: stdout chunk (state=3): >>><<< 7557 1726882103.79704: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7557 1726882103.79806: _low_level_execute_command(): starting 7557 1726882103.79810: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882103.6907852-8724-215335117832924/AnsiballZ_dnf.py && sleep 0' 7557 1726882103.80352: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7557 1726882103.80364: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7557 1726882103.80379: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882103.80400: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7557 1726882103.80463: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 <<< 7557 1726882103.80469: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882103.80541: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882103.80556: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882103.80623: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882104.20647: stdout chunk (state=3): >>> {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["iproute"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, 
"list": null, "nobest": null, "releasever": null}}} <<< 7557 1726882104.24645: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. <<< 7557 1726882104.24649: stdout chunk (state=3): >>><<< 7557 1726882104.24652: stderr chunk (state=3): >>><<< 7557 1726882104.24654: _low_level_execute_command() done: rc=0, stdout= {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["iproute"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. 
7557 1726882104.24662: done with _execute_module (ansible.legacy.dnf, {'name': 'iproute', 'state': 'present', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.dnf', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882103.6907852-8724-215335117832924/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7557 1726882104.24665: _low_level_execute_command(): starting 7557 1726882104.24668: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882103.6907852-8724-215335117832924/ > /dev/null 2>&1 && sleep 0' 7557 1726882104.25762: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882104.25766: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882104.25768: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882104.25771: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882104.25963: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882104.26011: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882104.27932: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882104.27963: stderr chunk (state=3): >>><<< 7557 1726882104.27974: stdout chunk (state=3): >>><<< 7557 1726882104.28020: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7557 1726882104.28060: handler run complete 7557 1726882104.28636: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 7557 1726882104.29042: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 7557 1726882104.29117: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 7557 1726882104.29207: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 7557 1726882104.29314: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 7557 1726882104.29532: variable '__install_status' from source: set_fact 7557 1726882104.29554: Evaluated conditional (__install_status is success): True 7557 1726882104.29606: attempt loop complete, returning result 7557 1726882104.29679: _execute() done 7557 1726882104.29683: dumping result to json 7557 1726882104.29685: done dumping result, returning 7557 1726882104.29687: done running TaskExecutor() for managed_node3/TASK: Install iproute [12673a56-9f93-ed48-b3a5-0000000010ad] 7557 1726882104.29689: sending task result for task 12673a56-9f93-ed48-b3a5-0000000010ad ok: [managed_node3] => { "attempts": 1, "changed": false, "rc": 0, "results": [] } MSG: Nothing to do 7557 1726882104.30023: no more pending results, returning what we have 7557 1726882104.30027: results queue empty 7557 1726882104.30029: checking for any_errors_fatal 7557 1726882104.30035: done checking for any_errors_fatal 7557 1726882104.30036: checking for max_fail_percentage 7557 1726882104.30038: done checking for max_fail_percentage 7557 1726882104.30039: checking to see if all hosts have failed and the running result is not ok 7557 1726882104.30040: done checking to see if all hosts have failed 7557 1726882104.30041: getting the remaining hosts for this loop 7557 1726882104.30043: done getting the remaining hosts for this loop 7557 1726882104.30046: getting the next task for host managed_node3 7557 1726882104.30054: done getting next task for host managed_node3 7557 1726882104.30056: ^ task is: TASK: Create veth interface {{ interface }} 7557 1726882104.30059: ^ state is: HOST STATE: block=2, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7557 1726882104.30064: getting variables 7557 1726882104.30065: in VariableManager get_vars() 7557 1726882104.30227: Calling all_inventory to load vars for managed_node3 7557 1726882104.30234: Calling groups_inventory to load vars for managed_node3 7557 1726882104.30237: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882104.30243: done sending task result for task 12673a56-9f93-ed48-b3a5-0000000010ad 7557 1726882104.30245: WORKER PROCESS EXITING 7557 1726882104.30255: Calling all_plugins_play to load vars for managed_node3 7557 1726882104.30257: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882104.30260: Calling groups_plugins_play to load vars for managed_node3 7557 1726882104.31930: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882104.34361: done with get_vars() 7557 1726882104.34391: done getting variables 7557 1726882104.34457: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 7557 1726882104.34583: variable 'interface' from source: play vars TASK [Create veth interface veth0] ********************************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:27 Friday 20 September 2024 21:28:24 -0400 (0:00:00.737) 0:00:30.199 ****** 7557 1726882104.34616: entering _queue_task() for managed_node3/command 7557 1726882104.34943: worker is 1 (out of 1 available) 7557 1726882104.34958: exiting _queue_task() for managed_node3/command 7557 1726882104.34977: done queuing things up, now waiting for results queue to drain 7557 1726882104.34978: waiting for pending results... 
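The task queued here, "Create veth interface veth0", runs the command action through an items lookup and is gated on the conditional type == 'veth' and state == 'present' and interface not in current_interfaces, as the trace that follows shows. The actual command list is not captured in this portion of the log; purely as an illustration of that shape, with the ip commands being a plausible assumption rather than the file's real contents, such a task typically looks like:

- name: Create veth interface {{ interface }}
  ansible.builtin.command: "{{ item }}"
  with_items:
    # Hypothetical commands; the real list in manage_test_interface.yml is not shown in this log excerpt.
    - ip link add {{ interface }} type veth peer name peer{{ interface }}
    - ip link set peer{{ interface }} up
    - ip link set {{ interface }} up
  when: type == 'veth' and state == 'present' and interface not in current_interfaces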
7557 1726882104.35171: running TaskExecutor() for managed_node3/TASK: Create veth interface veth0 7557 1726882104.35301: in run() - task 12673a56-9f93-ed48-b3a5-0000000010ae 7557 1726882104.35306: variable 'ansible_search_path' from source: unknown 7557 1726882104.35308: variable 'ansible_search_path' from source: unknown 7557 1726882104.35701: variable 'interface' from source: play vars 7557 1726882104.35705: variable 'interface' from source: play vars 7557 1726882104.35728: variable 'interface' from source: play vars 7557 1726882104.35980: Loaded config def from plugin (lookup/items) 7557 1726882104.36003: Loading LookupModule 'items' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/items.py 7557 1726882104.36038: variable 'omit' from source: magic vars 7557 1726882104.36183: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882104.36206: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882104.36224: variable 'omit' from source: magic vars 7557 1726882104.36469: variable 'ansible_distribution_major_version' from source: facts 7557 1726882104.36481: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882104.36684: variable 'type' from source: play vars 7557 1726882104.36697: variable 'state' from source: include params 7557 1726882104.36797: variable 'interface' from source: play vars 7557 1726882104.36801: variable 'current_interfaces' from source: set_fact 7557 1726882104.36803: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): True 7557 1726882104.36806: variable 'omit' from source: magic vars 7557 1726882104.36808: variable 'omit' from source: magic vars 7557 1726882104.37148: variable 'item' from source: unknown 7557 1726882104.37151: variable 'item' from source: unknown 7557 1726882104.37154: variable 'omit' from source: magic vars 7557 1726882104.37156: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7557 1726882104.37159: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7557 1726882104.37161: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7557 1726882104.37163: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7557 1726882104.37166: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7557 1726882104.37600: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7557 1726882104.37604: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882104.37606: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882104.37609: Set connection var ansible_module_compression to ZIP_DEFLATED 7557 1726882104.37611: Set connection var ansible_shell_executable to /bin/sh 7557 1726882104.37613: Set connection var ansible_shell_type to sh 7557 1726882104.37615: Set connection var ansible_pipelining to False 7557 1726882104.37617: Set connection var ansible_connection to ssh 7557 1726882104.37620: Set connection var ansible_timeout to 10 7557 1726882104.37626: variable 'ansible_shell_executable' from source: unknown 7557 1726882104.37629: variable 'ansible_connection' from source: unknown 7557 1726882104.37631: variable 'ansible_module_compression' from 
source: unknown 7557 1726882104.37633: variable 'ansible_shell_type' from source: unknown 7557 1726882104.37635: variable 'ansible_shell_executable' from source: unknown 7557 1726882104.37637: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882104.37639: variable 'ansible_pipelining' from source: unknown 7557 1726882104.37641: variable 'ansible_timeout' from source: unknown 7557 1726882104.37643: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882104.38044: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 7557 1726882104.38133: variable 'omit' from source: magic vars 7557 1726882104.38143: starting attempt loop 7557 1726882104.38150: running the handler 7557 1726882104.38171: _low_level_execute_command(): starting 7557 1726882104.38299: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7557 1726882104.39114: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882104.39155: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882104.39174: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882104.39190: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882104.39269: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882104.40862: stdout chunk (state=3): >>>/root <<< 7557 1726882104.41055: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882104.41059: stdout chunk (state=3): >>><<< 7557 1726882104.41157: stderr chunk (state=3): >>><<< 7557 1726882104.41200: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7557 1726882104.41216: _low_level_execute_command(): starting 7557 1726882104.41223: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882104.4120035-8756-123976347988994 `" && echo ansible-tmp-1726882104.4120035-8756-123976347988994="` echo /root/.ansible/tmp/ansible-tmp-1726882104.4120035-8756-123976347988994 `" ) && sleep 0' 7557 1726882104.41943: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7557 1726882104.41953: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7557 1726882104.41964: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882104.41986: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7557 1726882104.42087: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 <<< 7557 1726882104.42203: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882104.42207: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882104.42213: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882104.44050: stdout chunk (state=3): >>>ansible-tmp-1726882104.4120035-8756-123976347988994=/root/.ansible/tmp/ansible-tmp-1726882104.4120035-8756-123976347988994 <<< 7557 1726882104.44188: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882104.44191: stdout chunk (state=3): >>><<< 7557 1726882104.44202: stderr chunk (state=3): >>><<< 7557 1726882104.44237: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882104.4120035-8756-123976347988994=/root/.ansible/tmp/ansible-tmp-1726882104.4120035-8756-123976347988994 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass 
debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7557 1726882104.44267: variable 'ansible_module_compression' from source: unknown 7557 1726882104.44318: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-7557ap94rh2e/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 7557 1726882104.44566: variable 'ansible_facts' from source: unknown 7557 1726882104.44900: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882104.4120035-8756-123976347988994/AnsiballZ_command.py 7557 1726882104.45004: Sending initial data 7557 1726882104.45007: Sent initial data (154 bytes) 7557 1726882104.45764: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7557 1726882104.45773: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7557 1726882104.45783: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882104.45799: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7557 1726882104.45811: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 <<< 7557 1726882104.45866: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882104.45913: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882104.45931: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882104.45943: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882104.46015: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882104.47520: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 
debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 7557 1726882104.47604: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 7557 1726882104.47691: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-7557ap94rh2e/tmppay_qxe9 /root/.ansible/tmp/ansible-tmp-1726882104.4120035-8756-123976347988994/AnsiballZ_command.py <<< 7557 1726882104.47698: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882104.4120035-8756-123976347988994/AnsiballZ_command.py" <<< 7557 1726882104.47723: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-7557ap94rh2e/tmppay_qxe9" to remote "/root/.ansible/tmp/ansible-tmp-1726882104.4120035-8756-123976347988994/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882104.4120035-8756-123976347988994/AnsiballZ_command.py" <<< 7557 1726882104.48540: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882104.48563: stderr chunk (state=3): >>><<< 7557 1726882104.48662: stdout chunk (state=3): >>><<< 7557 1726882104.48665: done transferring module to remote 7557 1726882104.48668: _low_level_execute_command(): starting 7557 1726882104.48673: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882104.4120035-8756-123976347988994/ /root/.ansible/tmp/ansible-tmp-1726882104.4120035-8756-123976347988994/AnsiballZ_command.py && sleep 0' 7557 1726882104.49351: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7557 1726882104.49368: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7557 1726882104.49408: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882104.49428: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7557 1726882104.49516: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882104.49543: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882104.49616: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882104.51362: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882104.51389: stdout chunk (state=3): >>><<< 7557 1726882104.51419: stderr chunk (state=3): >>><<< 7557 1726882104.51441: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7557 1726882104.51547: _low_level_execute_command(): starting 7557 1726882104.51551: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882104.4120035-8756-123976347988994/AnsiballZ_command.py && sleep 0' 7557 1726882104.52182: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7557 1726882104.52202: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7557 1726882104.52218: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882104.52243: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7557 1726882104.52353: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882104.52382: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882104.52502: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882104.68126: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "add", "veth0", "type", "veth", "peer", "name", "peerveth0"], "start": "2024-09-20 21:28:24.672177", "end": "2024-09-20 21:28:24.678748", "delta": "0:00:00.006571", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link add veth0 type veth peer name peerveth0", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 7557 1726882104.70919: stderr chunk (state=3): >>>debug2: Received exit status 
from master 0 Shared connection to 10.31.10.229 closed. <<< 7557 1726882104.70971: stderr chunk (state=3): >>><<< 7557 1726882104.70974: stdout chunk (state=3): >>><<< 7557 1726882104.70999: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "add", "veth0", "type", "veth", "peer", "name", "peerveth0"], "start": "2024-09-20 21:28:24.672177", "end": "2024-09-20 21:28:24.678748", "delta": "0:00:00.006571", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link add veth0 type veth peer name peerveth0", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. 
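For orientation between the raw SSH chunks: the records above are Ansible's usual push cycle for one loop item — reuse the multiplexed SSH master, run 'echo ~' to resolve the remote home, create a per-task temp directory, sftp the AnsiballZ_command.py wrapper across, chmod it, execute it with the remote /usr/bin/python3.12, and read the JSON result back on stdout before removing the temp directory. As a rough, hypothetical sketch (not the real ansible.modules.command code), the first loop item reduces to roughly the following on the managed node; the result keys mirror the JSON payload shown in the log:

# Hypothetical sketch of what the first loop item boils down to on the managed
# node; not the actual AnsiballZ/command module implementation. Needs root and
# iproute2; subprocess.run raises FileNotFoundError if `ip` is missing.
import datetime
import json
import subprocess

argv = ["ip", "link", "add", "veth0", "type", "veth", "peer", "name", "peerveth0"]

start = datetime.datetime.now()
proc = subprocess.run(argv, capture_output=True, text=True)
end = datetime.datetime.now()

result = {
    "changed": proc.returncode == 0,   # simplification; the real module fails the task on non-zero rc
    "stdout": proc.stdout.rstrip("\n"),
    "stderr": proc.stderr.rstrip("\n"),
    "rc": proc.returncode,
    "cmd": argv,
    "start": str(start),
    "end": str(end),
    "delta": str(end - start),          # same "0:00:00.006571"-style delta as in the log
}
print(json.dumps(result))

Note that the raw module payload above reports "changed": true, while the callback line for the same item a few records further down prints "changed": false; the test tasks presumably normalise the reported status (e.g. with changed_when: false), so the two are expected to disagree.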
7557 1726882104.71042: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link add veth0 type veth peer name peerveth0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882104.4120035-8756-123976347988994/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7557 1726882104.71141: _low_level_execute_command(): starting 7557 1726882104.71146: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882104.4120035-8756-123976347988994/ > /dev/null 2>&1 && sleep 0' 7557 1726882104.72026: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7557 1726882104.72109: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882104.72233: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882104.72256: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882104.72331: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882104.74377: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882104.74381: stdout chunk (state=3): >>><<< 7557 1726882104.74388: stderr chunk (state=3): >>><<< 7557 1726882104.74444: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 
originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7557 1726882104.74485: handler run complete 7557 1726882104.74488: Evaluated conditional (False): False 7557 1726882104.74490: attempt loop complete, returning result 7557 1726882104.74521: variable 'item' from source: unknown 7557 1726882104.74700: variable 'item' from source: unknown ok: [managed_node3] => (item=ip link add veth0 type veth peer name peerveth0) => { "ansible_loop_var": "item", "changed": false, "cmd": [ "ip", "link", "add", "veth0", "type", "veth", "peer", "name", "peerveth0" ], "delta": "0:00:00.006571", "end": "2024-09-20 21:28:24.678748", "item": "ip link add veth0 type veth peer name peerveth0", "rc": 0, "start": "2024-09-20 21:28:24.672177" } 7557 1726882104.74858: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882104.74862: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882104.74865: variable 'omit' from source: magic vars 7557 1726882104.75102: variable 'ansible_distribution_major_version' from source: facts 7557 1726882104.75106: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882104.75505: variable 'type' from source: play vars 7557 1726882104.75511: variable 'state' from source: include params 7557 1726882104.75516: variable 'interface' from source: play vars 7557 1726882104.75518: variable 'current_interfaces' from source: set_fact 7557 1726882104.75702: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): True 7557 1726882104.75705: variable 'omit' from source: magic vars 7557 1726882104.75707: variable 'omit' from source: magic vars 7557 1726882104.75709: variable 'item' from source: unknown 7557 1726882104.75738: variable 'item' from source: unknown 7557 1726882104.75871: variable 'omit' from source: magic vars 7557 1726882104.75976: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7557 1726882104.75986: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7557 1726882104.75998: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7557 1726882104.76010: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7557 1726882104.76015: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882104.76104: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882104.76242: Set connection var ansible_module_compression to ZIP_DEFLATED 7557 1726882104.76336: Set connection var ansible_shell_executable to /bin/sh 7557 1726882104.76340: Set connection var ansible_shell_type to sh 7557 1726882104.76346: Set connection var ansible_pipelining to False 7557 1726882104.76348: Set connection var ansible_connection to ssh 7557 1726882104.76355: Set connection var ansible_timeout to 10 7557 1726882104.76499: variable 'ansible_shell_executable' from source: unknown 7557 1726882104.76503: variable 'ansible_connection' from 
source: unknown 7557 1726882104.76505: variable 'ansible_module_compression' from source: unknown 7557 1726882104.76507: variable 'ansible_shell_type' from source: unknown 7557 1726882104.76509: variable 'ansible_shell_executable' from source: unknown 7557 1726882104.76511: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882104.76513: variable 'ansible_pipelining' from source: unknown 7557 1726882104.76515: variable 'ansible_timeout' from source: unknown 7557 1726882104.76517: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882104.76610: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 7557 1726882104.76627: variable 'omit' from source: magic vars 7557 1726882104.76630: starting attempt loop 7557 1726882104.76633: running the handler 7557 1726882104.76644: _low_level_execute_command(): starting 7557 1726882104.76699: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7557 1726882104.77346: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7557 1726882104.77355: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7557 1726882104.77366: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882104.77402: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7557 1726882104.77410: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882104.77416: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7557 1726882104.77499: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882104.77522: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882104.77542: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882104.77623: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882104.79161: stdout chunk (state=3): >>>/root <<< 7557 1726882104.79401: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882104.79405: stdout chunk (state=3): >>><<< 7557 1726882104.79407: stderr chunk (state=3): >>><<< 7557 1726882104.79410: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: 
match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7557 1726882104.79412: _low_level_execute_command(): starting 7557 1726882104.79414: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882104.7931914-8756-236487298876099 `" && echo ansible-tmp-1726882104.7931914-8756-236487298876099="` echo /root/.ansible/tmp/ansible-tmp-1726882104.7931914-8756-236487298876099 `" ) && sleep 0' 7557 1726882104.79983: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882104.80012: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882104.80081: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882104.80127: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882104.80170: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882104.81997: stdout chunk (state=3): >>>ansible-tmp-1726882104.7931914-8756-236487298876099=/root/.ansible/tmp/ansible-tmp-1726882104.7931914-8756-236487298876099 <<< 7557 1726882104.82157: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882104.82161: stdout chunk (state=3): >>><<< 7557 1726882104.82163: stderr chunk (state=3): >>><<< 7557 1726882104.82183: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882104.7931914-8756-236487298876099=/root/.ansible/tmp/ansible-tmp-1726882104.7931914-8756-236487298876099 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7557 1726882104.82406: variable 'ansible_module_compression' from source: unknown 7557 1726882104.82410: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-7557ap94rh2e/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 7557 1726882104.82412: variable 'ansible_facts' from source: unknown 7557 1726882104.82414: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882104.7931914-8756-236487298876099/AnsiballZ_command.py 7557 1726882104.82524: Sending initial data 7557 1726882104.82533: Sent initial data (154 bytes) 7557 1726882104.83138: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7557 1726882104.83209: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882104.83245: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882104.83256: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882104.83273: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882104.83340: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882104.84829: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports 
extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 7557 1726882104.84891: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 7557 1726882104.84968: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-7557ap94rh2e/tmpw91vo_2l /root/.ansible/tmp/ansible-tmp-1726882104.7931914-8756-236487298876099/AnsiballZ_command.py <<< 7557 1726882104.84978: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882104.7931914-8756-236487298876099/AnsiballZ_command.py" <<< 7557 1726882104.85002: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory <<< 7557 1726882104.85033: stderr chunk (state=3): >>>debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-7557ap94rh2e/tmpw91vo_2l" to remote "/root/.ansible/tmp/ansible-tmp-1726882104.7931914-8756-236487298876099/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882104.7931914-8756-236487298876099/AnsiballZ_command.py" <<< 7557 1726882104.85985: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882104.85996: stdout chunk (state=3): >>><<< 7557 1726882104.86015: stderr chunk (state=3): >>><<< 7557 1726882104.86072: done transferring module to remote 7557 1726882104.86084: _low_level_execute_command(): starting 7557 1726882104.86095: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882104.7931914-8756-236487298876099/ /root/.ansible/tmp/ansible-tmp-1726882104.7931914-8756-236487298876099/AnsiballZ_command.py && sleep 0' 7557 1726882104.86703: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7557 1726882104.86713: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7557 1726882104.86724: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882104.86808: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882104.86828: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882104.86845: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882104.86856: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882104.86928: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882104.88767: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882104.88772: stderr chunk (state=3): >>><<< 7557 1726882104.88775: stdout chunk (state=3): >>><<< 7557 1726882104.88778: _low_level_execute_command() done: rc=0, 
stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7557 1726882104.88781: _low_level_execute_command(): starting 7557 1726882104.88783: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882104.7931914-8756-236487298876099/AnsiballZ_command.py && sleep 0' 7557 1726882104.89340: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7557 1726882104.89356: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7557 1726882104.89372: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882104.89389: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7557 1726882104.89411: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 <<< 7557 1726882104.89427: stderr chunk (state=3): >>>debug2: match not found <<< 7557 1726882104.89516: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882104.89540: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882104.89567: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882104.89649: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882105.04760: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "peerveth0", "up"], "start": "2024-09-20 21:28:25.042463", "end": "2024-09-20 21:28:25.045842", "delta": "0:00:00.003379", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set peerveth0 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, 
"strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 7557 1726882105.06404: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. <<< 7557 1726882105.06408: stderr chunk (state=3): >>><<< 7557 1726882105.06410: stdout chunk (state=3): >>><<< 7557 1726882105.06413: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "peerveth0", "up"], "start": "2024-09-20 21:28:25.042463", "end": "2024-09-20 21:28:25.045842", "delta": "0:00:00.003379", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set peerveth0 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. 
7557 1726882105.06416: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link set peerveth0 up', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882104.7931914-8756-236487298876099/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7557 1726882105.06418: _low_level_execute_command(): starting 7557 1726882105.06422: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882104.7931914-8756-236487298876099/ > /dev/null 2>&1 && sleep 0' 7557 1726882105.07211: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7557 1726882105.07226: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7557 1726882105.07242: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882105.07260: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7557 1726882105.07276: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 <<< 7557 1726882105.07308: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882105.07405: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882105.07426: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882105.07513: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882105.09280: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882105.09283: stdout chunk (state=3): >>><<< 7557 1726882105.09290: stderr chunk (state=3): >>><<< 7557 1726882105.09316: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7557 1726882105.09398: handler run complete 7557 1726882105.09401: Evaluated conditional (False): False 7557 1726882105.09404: attempt loop complete, returning result 7557 1726882105.09406: variable 'item' from source: unknown 7557 1726882105.09453: variable 'item' from source: unknown ok: [managed_node3] => (item=ip link set peerveth0 up) => { "ansible_loop_var": "item", "changed": false, "cmd": [ "ip", "link", "set", "peerveth0", "up" ], "delta": "0:00:00.003379", "end": "2024-09-20 21:28:25.045842", "item": "ip link set peerveth0 up", "rc": 0, "start": "2024-09-20 21:28:25.042463" } 7557 1726882105.09573: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882105.09577: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882105.09580: variable 'omit' from source: magic vars 7557 1726882105.10302: variable 'ansible_distribution_major_version' from source: facts 7557 1726882105.10306: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882105.10308: variable 'type' from source: play vars 7557 1726882105.10310: variable 'state' from source: include params 7557 1726882105.10313: variable 'interface' from source: play vars 7557 1726882105.10314: variable 'current_interfaces' from source: set_fact 7557 1726882105.10316: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): True 7557 1726882105.10318: variable 'omit' from source: magic vars 7557 1726882105.10320: variable 'omit' from source: magic vars 7557 1726882105.10322: variable 'item' from source: unknown 7557 1726882105.10332: variable 'item' from source: unknown 7557 1726882105.10347: variable 'omit' from source: magic vars 7557 1726882105.10373: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7557 1726882105.10386: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7557 1726882105.10394: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7557 1726882105.10411: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7557 1726882105.10414: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882105.10417: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882105.10799: Set connection var ansible_module_compression to ZIP_DEFLATED 7557 1726882105.10803: Set connection var ansible_shell_executable to /bin/sh 7557 1726882105.10805: Set connection var ansible_shell_type to sh 7557 1726882105.10807: Set connection var ansible_pipelining to False 7557 1726882105.10809: Set connection var ansible_connection to ssh 7557 1726882105.10811: Set connection var ansible_timeout to 10 7557 1726882105.10814: variable 
'ansible_shell_executable' from source: unknown 7557 1726882105.10817: variable 'ansible_connection' from source: unknown 7557 1726882105.10820: variable 'ansible_module_compression' from source: unknown 7557 1726882105.10823: variable 'ansible_shell_type' from source: unknown 7557 1726882105.10825: variable 'ansible_shell_executable' from source: unknown 7557 1726882105.10828: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882105.10830: variable 'ansible_pipelining' from source: unknown 7557 1726882105.10833: variable 'ansible_timeout' from source: unknown 7557 1726882105.10835: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882105.10838: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 7557 1726882105.10841: variable 'omit' from source: magic vars 7557 1726882105.10843: starting attempt loop 7557 1726882105.10846: running the handler 7557 1726882105.10849: _low_level_execute_command(): starting 7557 1726882105.10851: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7557 1726882105.11443: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7557 1726882105.11456: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7557 1726882105.11467: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882105.11480: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7557 1726882105.11507: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration <<< 7557 1726882105.11524: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882105.11568: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882105.11610: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882105.11627: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882105.11645: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882105.11718: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882105.13373: stdout chunk (state=3): >>>/root <<< 7557 1726882105.13433: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882105.13436: stdout chunk (state=3): >>><<< 7557 1726882105.13443: stderr chunk (state=3): >>><<< 7557 1726882105.13515: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7557 1726882105.13592: _low_level_execute_command(): starting 7557 1726882105.13597: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882105.1351385-8756-278406435272691 `" && echo ansible-tmp-1726882105.1351385-8756-278406435272691="` echo /root/.ansible/tmp/ansible-tmp-1726882105.1351385-8756-278406435272691 `" ) && sleep 0' 7557 1726882105.14452: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7557 1726882105.14466: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7557 1726882105.14483: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882105.14792: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7557 1726882105.14797: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882105.14817: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882105.14890: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882105.16686: stdout chunk (state=3): >>>ansible-tmp-1726882105.1351385-8756-278406435272691=/root/.ansible/tmp/ansible-tmp-1726882105.1351385-8756-278406435272691 <<< 7557 1726882105.16878: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882105.16881: stdout chunk (state=3): >>><<< 7557 1726882105.16884: stderr chunk (state=3): >>><<< 7557 1726882105.17109: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882105.1351385-8756-278406435272691=/root/.ansible/tmp/ansible-tmp-1726882105.1351385-8756-278406435272691 , 
stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7557 1726882105.17117: variable 'ansible_module_compression' from source: unknown 7557 1726882105.17119: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-7557ap94rh2e/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 7557 1726882105.17121: variable 'ansible_facts' from source: unknown 7557 1726882105.17124: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882105.1351385-8756-278406435272691/AnsiballZ_command.py 7557 1726882105.17661: Sending initial data 7557 1726882105.17664: Sent initial data (154 bytes) 7557 1726882105.18670: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7557 1726882105.18686: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7557 1726882105.18689: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882105.18846: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882105.18960: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882105.19039: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882105.19113: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882105.20609: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension 
"hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 7557 1726882105.20614: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 7557 1726882105.20679: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 7557 1726882105.20744: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-7557ap94rh2e/tmp1uis1dvh /root/.ansible/tmp/ansible-tmp-1726882105.1351385-8756-278406435272691/AnsiballZ_command.py <<< 7557 1726882105.20766: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882105.1351385-8756-278406435272691/AnsiballZ_command.py" <<< 7557 1726882105.20807: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-7557ap94rh2e/tmp1uis1dvh" to remote "/root/.ansible/tmp/ansible-tmp-1726882105.1351385-8756-278406435272691/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882105.1351385-8756-278406435272691/AnsiballZ_command.py" <<< 7557 1726882105.21734: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882105.21980: stderr chunk (state=3): >>><<< 7557 1726882105.21983: stdout chunk (state=3): >>><<< 7557 1726882105.21986: done transferring module to remote 7557 1726882105.21988: _low_level_execute_command(): starting 7557 1726882105.21990: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882105.1351385-8756-278406435272691/ /root/.ansible/tmp/ansible-tmp-1726882105.1351385-8756-278406435272691/AnsiballZ_command.py && sleep 0' 7557 1726882105.22859: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7557 1726882105.22958: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882105.23043: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882105.23106: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882105.24782: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882105.24785: stdout chunk (state=3): >>><<< 7557 
1726882105.24786: stderr chunk (state=3): >>><<< 7557 1726882105.24800: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7557 1726882105.24828: _low_level_execute_command(): starting 7557 1726882105.24831: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882105.1351385-8756-278406435272691/AnsiballZ_command.py && sleep 0' 7557 1726882105.25214: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882105.25217: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7557 1726882105.25221: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 <<< 7557 1726882105.25239: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 7557 1726882105.25246: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882105.25297: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882105.25301: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882105.25357: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882105.40561: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "veth0", "up"], "start": "2024-09-20 21:28:25.400496", "end": "2024-09-20 21:28:25.403826", "delta": "0:00:00.003330", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set veth0 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, 
"strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}}<<< 7557 1726882105.40616: stdout chunk (state=3): >>> <<< 7557 1726882105.41925: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. <<< 7557 1726882105.41958: stderr chunk (state=3): >>><<< 7557 1726882105.41962: stdout chunk (state=3): >>><<< 7557 1726882105.41977: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "veth0", "up"], "start": "2024-09-20 21:28:25.400496", "end": "2024-09-20 21:28:25.403826", "delta": "0:00:00.003330", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set veth0 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. 
7557 1726882105.42003: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link set veth0 up', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882105.1351385-8756-278406435272691/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7557 1726882105.42008: _low_level_execute_command(): starting 7557 1726882105.42013: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882105.1351385-8756-278406435272691/ > /dev/null 2>&1 && sleep 0' 7557 1726882105.42456: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882105.42460: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7557 1726882105.42514: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882105.42544: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882105.42547: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882105.42550: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882105.42602: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882105.44335: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882105.44363: stderr chunk (state=3): >>><<< 7557 1726882105.44366: stdout chunk (state=3): >>><<< 7557 1726882105.44381: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7557 1726882105.44385: handler run complete 7557 1726882105.44405: Evaluated conditional (False): False 7557 1726882105.44412: attempt loop complete, returning result 7557 1726882105.44430: variable 'item' from source: unknown 7557 1726882105.44492: variable 'item' from source: unknown ok: [managed_node3] => (item=ip link set veth0 up) => { "ansible_loop_var": "item", "changed": false, "cmd": [ "ip", "link", "set", "veth0", "up" ], "delta": "0:00:00.003330", "end": "2024-09-20 21:28:25.403826", "item": "ip link set veth0 up", "rc": 0, "start": "2024-09-20 21:28:25.400496" } 7557 1726882105.44610: dumping result to json 7557 1726882105.44618: done dumping result, returning 7557 1726882105.44620: done running TaskExecutor() for managed_node3/TASK: Create veth interface veth0 [12673a56-9f93-ed48-b3a5-0000000010ae] 7557 1726882105.44622: sending task result for task 12673a56-9f93-ed48-b3a5-0000000010ae 7557 1726882105.44665: done sending task result for task 12673a56-9f93-ed48-b3a5-0000000010ae 7557 1726882105.44668: WORKER PROCESS EXITING 7557 1726882105.44725: no more pending results, returning what we have 7557 1726882105.44729: results queue empty 7557 1726882105.44730: checking for any_errors_fatal 7557 1726882105.44738: done checking for any_errors_fatal 7557 1726882105.44738: checking for max_fail_percentage 7557 1726882105.44740: done checking for max_fail_percentage 7557 1726882105.44741: checking to see if all hosts have failed and the running result is not ok 7557 1726882105.44741: done checking to see if all hosts have failed 7557 1726882105.44742: getting the remaining hosts for this loop 7557 1726882105.44744: done getting the remaining hosts for this loop 7557 1726882105.44746: getting the next task for host managed_node3 7557 1726882105.44752: done getting next task for host managed_node3 7557 1726882105.44755: ^ task is: TASK: Set up veth as managed by NetworkManager 7557 1726882105.44758: ^ state is: HOST STATE: block=2, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7557 1726882105.44762: getting variables 7557 1726882105.44763: in VariableManager get_vars() 7557 1726882105.44814: Calling all_inventory to load vars for managed_node3 7557 1726882105.44817: Calling groups_inventory to load vars for managed_node3 7557 1726882105.44820: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882105.44830: Calling all_plugins_play to load vars for managed_node3 7557 1726882105.44833: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882105.44835: Calling groups_plugins_play to load vars for managed_node3 7557 1726882105.45747: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882105.46601: done with get_vars() 7557 1726882105.46619: done getting variables 7557 1726882105.46664: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set up veth as managed by NetworkManager] ******************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:35 Friday 20 September 2024 21:28:25 -0400 (0:00:01.120) 0:00:31.319 ****** 7557 1726882105.46685: entering _queue_task() for managed_node3/command 7557 1726882105.46933: worker is 1 (out of 1 available) 7557 1726882105.46948: exiting _queue_task() for managed_node3/command 7557 1726882105.46961: done queuing things up, now waiting for results queue to drain 7557 1726882105.46962: waiting for pending results... 7557 1726882105.47140: running TaskExecutor() for managed_node3/TASK: Set up veth as managed by NetworkManager 7557 1726882105.47213: in run() - task 12673a56-9f93-ed48-b3a5-0000000010af 7557 1726882105.47226: variable 'ansible_search_path' from source: unknown 7557 1726882105.47230: variable 'ansible_search_path' from source: unknown 7557 1726882105.47257: calling self._execute() 7557 1726882105.47336: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882105.47340: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882105.47349: variable 'omit' from source: magic vars 7557 1726882105.47622: variable 'ansible_distribution_major_version' from source: facts 7557 1726882105.47632: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882105.47733: variable 'type' from source: play vars 7557 1726882105.47737: variable 'state' from source: include params 7557 1726882105.47747: Evaluated conditional (type == 'veth' and state == 'present'): True 7557 1726882105.47750: variable 'omit' from source: magic vars 7557 1726882105.47775: variable 'omit' from source: magic vars 7557 1726882105.47843: variable 'interface' from source: play vars 7557 1726882105.47859: variable 'omit' from source: magic vars 7557 1726882105.47891: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7557 1726882105.47921: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7557 1726882105.47937: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7557 1726882105.47950: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py 
(found_in_cache=True, class_only=False) 7557 1726882105.47962: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7557 1726882105.47985: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7557 1726882105.47988: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882105.47991: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882105.48064: Set connection var ansible_module_compression to ZIP_DEFLATED 7557 1726882105.48068: Set connection var ansible_shell_executable to /bin/sh 7557 1726882105.48072: Set connection var ansible_shell_type to sh 7557 1726882105.48075: Set connection var ansible_pipelining to False 7557 1726882105.48078: Set connection var ansible_connection to ssh 7557 1726882105.48087: Set connection var ansible_timeout to 10 7557 1726882105.48104: variable 'ansible_shell_executable' from source: unknown 7557 1726882105.48107: variable 'ansible_connection' from source: unknown 7557 1726882105.48109: variable 'ansible_module_compression' from source: unknown 7557 1726882105.48111: variable 'ansible_shell_type' from source: unknown 7557 1726882105.48114: variable 'ansible_shell_executable' from source: unknown 7557 1726882105.48116: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882105.48119: variable 'ansible_pipelining' from source: unknown 7557 1726882105.48121: variable 'ansible_timeout' from source: unknown 7557 1726882105.48125: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882105.48227: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 7557 1726882105.48236: variable 'omit' from source: magic vars 7557 1726882105.48239: starting attempt loop 7557 1726882105.48244: running the handler 7557 1726882105.48257: _low_level_execute_command(): starting 7557 1726882105.48263: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7557 1726882105.48765: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7557 1726882105.48782: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7557 1726882105.48786: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882105.48803: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882105.48864: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 
1726882105.48871: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882105.48875: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882105.48920: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882105.50453: stdout chunk (state=3): >>>/root <<< 7557 1726882105.50548: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882105.50577: stderr chunk (state=3): >>><<< 7557 1726882105.50581: stdout chunk (state=3): >>><<< 7557 1726882105.50607: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7557 1726882105.50619: _low_level_execute_command(): starting 7557 1726882105.50626: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882105.5060694-8819-84837068804464 `" && echo ansible-tmp-1726882105.5060694-8819-84837068804464="` echo /root/.ansible/tmp/ansible-tmp-1726882105.5060694-8819-84837068804464 `" ) && sleep 0' 7557 1726882105.51075: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7557 1726882105.51079: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 <<< 7557 1726882105.51089: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882105.51092: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882105.51098: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found <<< 7557 1726882105.51100: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882105.51143: stderr chunk (state=3): >>>debug1: 
auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882105.51150: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882105.51153: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882105.51195: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882105.52994: stdout chunk (state=3): >>>ansible-tmp-1726882105.5060694-8819-84837068804464=/root/.ansible/tmp/ansible-tmp-1726882105.5060694-8819-84837068804464 <<< 7557 1726882105.53099: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882105.53125: stderr chunk (state=3): >>><<< 7557 1726882105.53129: stdout chunk (state=3): >>><<< 7557 1726882105.53144: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882105.5060694-8819-84837068804464=/root/.ansible/tmp/ansible-tmp-1726882105.5060694-8819-84837068804464 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7557 1726882105.53173: variable 'ansible_module_compression' from source: unknown 7557 1726882105.53217: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-7557ap94rh2e/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 7557 1726882105.53246: variable 'ansible_facts' from source: unknown 7557 1726882105.53305: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882105.5060694-8819-84837068804464/AnsiballZ_command.py 7557 1726882105.53409: Sending initial data 7557 1726882105.53413: Sent initial data (153 bytes) 7557 1726882105.53867: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7557 1726882105.53871: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found <<< 7557 1726882105.53873: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 7557 1726882105.53876: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found <<< 7557 1726882105.53879: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882105.53927: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882105.53934: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882105.53983: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882105.55456: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 7557 1726882105.55500: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 7557 1726882105.55543: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-7557ap94rh2e/tmpl8tkmjod /root/.ansible/tmp/ansible-tmp-1726882105.5060694-8819-84837068804464/AnsiballZ_command.py <<< 7557 1726882105.55547: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882105.5060694-8819-84837068804464/AnsiballZ_command.py" <<< 7557 1726882105.55592: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-7557ap94rh2e/tmpl8tkmjod" to remote "/root/.ansible/tmp/ansible-tmp-1726882105.5060694-8819-84837068804464/AnsiballZ_command.py" <<< 7557 1726882105.55598: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882105.5060694-8819-84837068804464/AnsiballZ_command.py" <<< 7557 1726882105.56116: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882105.56159: stderr chunk (state=3): >>><<< 7557 1726882105.56163: stdout chunk (state=3): >>><<< 7557 1726882105.56187: done transferring module to remote 7557 1726882105.56199: _low_level_execute_command(): starting 7557 1726882105.56205: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882105.5060694-8819-84837068804464/ /root/.ansible/tmp/ansible-tmp-1726882105.5060694-8819-84837068804464/AnsiballZ_command.py && sleep 0' 7557 1726882105.56653: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7557 1726882105.56656: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found <<< 7557 1726882105.56658: stderr chunk (state=3): >>>debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882105.56665: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882105.56667: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882105.56718: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882105.56721: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882105.56768: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882105.58458: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882105.58481: stderr chunk (state=3): >>><<< 7557 1726882105.58485: stdout chunk (state=3): >>><<< 7557 1726882105.58503: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7557 1726882105.58506: _low_level_execute_command(): starting 7557 1726882105.58510: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882105.5060694-8819-84837068804464/AnsiballZ_command.py && sleep 0' 7557 1726882105.58969: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882105.58972: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882105.58975: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration 
data /etc/ssh/ssh_config <<< 7557 1726882105.58978: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882105.59033: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882105.59041: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882105.59045: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882105.59095: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882105.75810: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["nmcli", "d", "set", "veth0", "managed", "true"], "start": "2024-09-20 21:28:25.738010", "end": "2024-09-20 21:28:25.755864", "delta": "0:00:00.017854", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli d set veth0 managed true", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 7557 1726882105.77202: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. <<< 7557 1726882105.77229: stdout chunk (state=3): >>><<< 7557 1726882105.77232: stderr chunk (state=3): >>><<< 7557 1726882105.77373: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["nmcli", "d", "set", "veth0", "managed", "true"], "start": "2024-09-20 21:28:25.738010", "end": "2024-09-20 21:28:25.755864", "delta": "0:00:00.017854", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli d set veth0 managed true", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. 
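With that, both commands this section actually runs against the test interface on managed_node3 have returned rc=0: 'ip link set veth0 up' from the 'Create veth interface veth0' task, and 'nmcli d set veth0 managed true' from the 'Set up veth as managed by NetworkManager' task. Executed directly on the target host they amount to the short sequence below; the two show commands at the end are not part of this run and are included only as an illustrative way to confirm the interface state afterwards.

    # The two operations taken from the module results above
    ip link set veth0 up                 # bring the veth test interface up
    nmcli d set veth0 managed true       # mark the device as managed by NetworkManager
    # Illustrative verification only (not executed in this log)
    ip -d link show veth0
    nmcli device show veth0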
7557 1726882105.77377: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli d set veth0 managed true', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882105.5060694-8819-84837068804464/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7557 1726882105.77381: _low_level_execute_command(): starting 7557 1726882105.77383: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882105.5060694-8819-84837068804464/ > /dev/null 2>&1 && sleep 0' 7557 1726882105.77967: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7557 1726882105.77982: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7557 1726882105.78005: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882105.78057: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882105.78121: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882105.78144: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882105.78266: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882105.80076: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882105.80079: stdout chunk (state=3): >>><<< 7557 1726882105.80082: stderr chunk (state=3): >>><<< 7557 1726882105.80103: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7557 1726882105.80116: handler run complete 7557 1726882105.80199: Evaluated conditional (False): False 7557 1726882105.80202: attempt loop complete, returning result 7557 1726882105.80205: _execute() done 7557 1726882105.80206: dumping result to json 7557 1726882105.80208: done dumping result, returning 7557 1726882105.80210: done running TaskExecutor() for managed_node3/TASK: Set up veth as managed by NetworkManager [12673a56-9f93-ed48-b3a5-0000000010af] 7557 1726882105.80213: sending task result for task 12673a56-9f93-ed48-b3a5-0000000010af ok: [managed_node3] => { "changed": false, "cmd": [ "nmcli", "d", "set", "veth0", "managed", "true" ], "delta": "0:00:00.017854", "end": "2024-09-20 21:28:25.755864", "rc": 0, "start": "2024-09-20 21:28:25.738010" } 7557 1726882105.80563: no more pending results, returning what we have 7557 1726882105.80567: results queue empty 7557 1726882105.80568: checking for any_errors_fatal 7557 1726882105.80583: done checking for any_errors_fatal 7557 1726882105.80584: checking for max_fail_percentage 7557 1726882105.80586: done checking for max_fail_percentage 7557 1726882105.80587: checking to see if all hosts have failed and the running result is not ok 7557 1726882105.80588: done checking to see if all hosts have failed 7557 1726882105.80588: getting the remaining hosts for this loop 7557 1726882105.80590: done getting the remaining hosts for this loop 7557 1726882105.80598: getting the next task for host managed_node3 7557 1726882105.80604: done getting next task for host managed_node3 7557 1726882105.80607: ^ task is: TASK: Delete veth interface {{ interface }} 7557 1726882105.80611: ^ state is: HOST STATE: block=2, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7557 1726882105.80616: getting variables 7557 1726882105.80617: in VariableManager get_vars() 7557 1726882105.80672: Calling all_inventory to load vars for managed_node3 7557 1726882105.80675: Calling groups_inventory to load vars for managed_node3 7557 1726882105.80678: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882105.80689: Calling all_plugins_play to load vars for managed_node3 7557 1726882105.80811: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882105.80818: done sending task result for task 12673a56-9f93-ed48-b3a5-0000000010af 7557 1726882105.80822: WORKER PROCESS EXITING 7557 1726882105.80827: Calling groups_plugins_play to load vars for managed_node3 7557 1726882105.82483: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882105.84150: done with get_vars() 7557 1726882105.84175: done getting variables 7557 1726882105.84237: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 7557 1726882105.84350: variable 'interface' from source: play vars TASK [Delete veth interface veth0] ********************************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:43 Friday 20 September 2024 21:28:25 -0400 (0:00:00.376) 0:00:31.696 ****** 7557 1726882105.84381: entering _queue_task() for managed_node3/command 7557 1726882105.84716: worker is 1 (out of 1 available) 7557 1726882105.84729: exiting _queue_task() for managed_node3/command 7557 1726882105.84742: done queuing things up, now waiting for results queue to drain 7557 1726882105.84743: waiting for pending results... 
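Each SSH invocation in this section logs 'auto-mux: Trying existing master at /root/.ansible/cp/537759ca41' followed by 'mux_client_request_session', meaning the connections reuse one persistent OpenSSH ControlMaster socket instead of renegotiating per command, which is why the individual round trips above complete in a few tens of milliseconds. Purely as an illustration (nothing in this run does this, and the root@ login is assumed from the /root home directory shown above), the same control socket could be queried or shut down out of band with OpenSSH's control commands:

    # Illustrative only: inspect the multiplexed master this run reuses
    ssh -o ControlPath=/root/.ansible/cp/537759ca41 -O check root@10.31.10.229
    # Ask the master to exit when the test run is finished
    ssh -o ControlPath=/root/.ansible/cp/537759ca41 -O exit root@10.31.10.229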
7557 1726882105.85035: running TaskExecutor() for managed_node3/TASK: Delete veth interface veth0 7557 1726882105.85130: in run() - task 12673a56-9f93-ed48-b3a5-0000000010b0 7557 1726882105.85144: variable 'ansible_search_path' from source: unknown 7557 1726882105.85147: variable 'ansible_search_path' from source: unknown 7557 1726882105.85183: calling self._execute() 7557 1726882105.85286: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882105.85296: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882105.85305: variable 'omit' from source: magic vars 7557 1726882105.85672: variable 'ansible_distribution_major_version' from source: facts 7557 1726882105.85685: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882105.85882: variable 'type' from source: play vars 7557 1726882105.85886: variable 'state' from source: include params 7557 1726882105.85891: variable 'interface' from source: play vars 7557 1726882105.85967: variable 'current_interfaces' from source: set_fact 7557 1726882105.85971: Evaluated conditional (type == 'veth' and state == 'absent' and interface in current_interfaces): False 7557 1726882105.85974: when evaluation is False, skipping this task 7557 1726882105.85977: _execute() done 7557 1726882105.85979: dumping result to json 7557 1726882105.85982: done dumping result, returning 7557 1726882105.85985: done running TaskExecutor() for managed_node3/TASK: Delete veth interface veth0 [12673a56-9f93-ed48-b3a5-0000000010b0] 7557 1726882105.85987: sending task result for task 12673a56-9f93-ed48-b3a5-0000000010b0 7557 1726882105.86051: done sending task result for task 12673a56-9f93-ed48-b3a5-0000000010b0 7557 1726882105.86054: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "type == 'veth' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was False" } 7557 1726882105.86108: no more pending results, returning what we have 7557 1726882105.86113: results queue empty 7557 1726882105.86114: checking for any_errors_fatal 7557 1726882105.86123: done checking for any_errors_fatal 7557 1726882105.86123: checking for max_fail_percentage 7557 1726882105.86125: done checking for max_fail_percentage 7557 1726882105.86126: checking to see if all hosts have failed and the running result is not ok 7557 1726882105.86127: done checking to see if all hosts have failed 7557 1726882105.86127: getting the remaining hosts for this loop 7557 1726882105.86129: done getting the remaining hosts for this loop 7557 1726882105.86132: getting the next task for host managed_node3 7557 1726882105.86139: done getting next task for host managed_node3 7557 1726882105.86142: ^ task is: TASK: Create dummy interface {{ interface }} 7557 1726882105.86145: ^ state is: HOST STATE: block=2, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7557 1726882105.86150: getting variables 7557 1726882105.86151: in VariableManager get_vars() 7557 1726882105.86204: Calling all_inventory to load vars for managed_node3 7557 1726882105.86208: Calling groups_inventory to load vars for managed_node3 7557 1726882105.86210: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882105.86222: Calling all_plugins_play to load vars for managed_node3 7557 1726882105.86225: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882105.86228: Calling groups_plugins_play to load vars for managed_node3 7557 1726882105.87756: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882105.89246: done with get_vars() 7557 1726882105.89278: done getting variables 7557 1726882105.89338: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 7557 1726882105.89446: variable 'interface' from source: play vars TASK [Create dummy interface veth0] ******************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:49 Friday 20 September 2024 21:28:25 -0400 (0:00:00.050) 0:00:31.747 ****** 7557 1726882105.89476: entering _queue_task() for managed_node3/command 7557 1726882105.89818: worker is 1 (out of 1 available) 7557 1726882105.89832: exiting _queue_task() for managed_node3/command 7557 1726882105.89845: done queuing things up, now waiting for results queue to drain 7557 1726882105.89846: waiting for pending results... 
7557 1726882105.90186: running TaskExecutor() for managed_node3/TASK: Create dummy interface veth0 7557 1726882105.90321: in run() - task 12673a56-9f93-ed48-b3a5-0000000010b1 7557 1726882105.90325: variable 'ansible_search_path' from source: unknown 7557 1726882105.90328: variable 'ansible_search_path' from source: unknown 7557 1726882105.90331: calling self._execute() 7557 1726882105.90390: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882105.90398: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882105.90409: variable 'omit' from source: magic vars 7557 1726882105.90873: variable 'ansible_distribution_major_version' from source: facts 7557 1726882105.90876: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882105.90989: variable 'type' from source: play vars 7557 1726882105.90998: variable 'state' from source: include params 7557 1726882105.91002: variable 'interface' from source: play vars 7557 1726882105.91005: variable 'current_interfaces' from source: set_fact 7557 1726882105.91013: Evaluated conditional (type == 'dummy' and state == 'present' and interface not in current_interfaces): False 7557 1726882105.91016: when evaluation is False, skipping this task 7557 1726882105.91019: _execute() done 7557 1726882105.91022: dumping result to json 7557 1726882105.91024: done dumping result, returning 7557 1726882105.91031: done running TaskExecutor() for managed_node3/TASK: Create dummy interface veth0 [12673a56-9f93-ed48-b3a5-0000000010b1] 7557 1726882105.91036: sending task result for task 12673a56-9f93-ed48-b3a5-0000000010b1 7557 1726882105.91129: done sending task result for task 12673a56-9f93-ed48-b3a5-0000000010b1 7557 1726882105.91133: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "type == 'dummy' and state == 'present' and interface not in current_interfaces", "skip_reason": "Conditional result was False" } 7557 1726882105.91182: no more pending results, returning what we have 7557 1726882105.91187: results queue empty 7557 1726882105.91188: checking for any_errors_fatal 7557 1726882105.91198: done checking for any_errors_fatal 7557 1726882105.91199: checking for max_fail_percentage 7557 1726882105.91201: done checking for max_fail_percentage 7557 1726882105.91202: checking to see if all hosts have failed and the running result is not ok 7557 1726882105.91203: done checking to see if all hosts have failed 7557 1726882105.91203: getting the remaining hosts for this loop 7557 1726882105.91205: done getting the remaining hosts for this loop 7557 1726882105.91209: getting the next task for host managed_node3 7557 1726882105.91216: done getting next task for host managed_node3 7557 1726882105.91219: ^ task is: TASK: Delete dummy interface {{ interface }} 7557 1726882105.91224: ^ state is: HOST STATE: block=2, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7557 1726882105.91229: getting variables 7557 1726882105.91231: in VariableManager get_vars() 7557 1726882105.91285: Calling all_inventory to load vars for managed_node3 7557 1726882105.91288: Calling groups_inventory to load vars for managed_node3 7557 1726882105.91291: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882105.91544: Calling all_plugins_play to load vars for managed_node3 7557 1726882105.91548: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882105.91551: Calling groups_plugins_play to load vars for managed_node3 7557 1726882105.97937: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882106.01200: done with get_vars() 7557 1726882106.01235: done getting variables 7557 1726882106.01283: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 7557 1726882106.01479: variable 'interface' from source: play vars TASK [Delete dummy interface veth0] ******************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:54 Friday 20 September 2024 21:28:26 -0400 (0:00:00.121) 0:00:31.869 ****** 7557 1726882106.01609: entering _queue_task() for managed_node3/command 7557 1726882106.02363: worker is 1 (out of 1 available) 7557 1726882106.02376: exiting _queue_task() for managed_node3/command 7557 1726882106.02389: done queuing things up, now waiting for results queue to drain 7557 1726882106.02390: waiting for pending results... 
7557 1726882106.03054: running TaskExecutor() for managed_node3/TASK: Delete dummy interface veth0 7557 1726882106.03151: in run() - task 12673a56-9f93-ed48-b3a5-0000000010b2 7557 1726882106.03172: variable 'ansible_search_path' from source: unknown 7557 1726882106.03180: variable 'ansible_search_path' from source: unknown 7557 1726882106.03240: calling self._execute() 7557 1726882106.03464: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882106.03475: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882106.03479: variable 'omit' from source: magic vars 7557 1726882106.03867: variable 'ansible_distribution_major_version' from source: facts 7557 1726882106.03899: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882106.04177: variable 'type' from source: play vars 7557 1726882106.04189: variable 'state' from source: include params 7557 1726882106.04205: variable 'interface' from source: play vars 7557 1726882106.04216: variable 'current_interfaces' from source: set_fact 7557 1726882106.04239: Evaluated conditional (type == 'dummy' and state == 'absent' and interface in current_interfaces): False 7557 1726882106.04263: when evaluation is False, skipping this task 7557 1726882106.04266: _execute() done 7557 1726882106.04269: dumping result to json 7557 1726882106.04348: done dumping result, returning 7557 1726882106.04352: done running TaskExecutor() for managed_node3/TASK: Delete dummy interface veth0 [12673a56-9f93-ed48-b3a5-0000000010b2] 7557 1726882106.04355: sending task result for task 12673a56-9f93-ed48-b3a5-0000000010b2 7557 1726882106.04437: done sending task result for task 12673a56-9f93-ed48-b3a5-0000000010b2 7557 1726882106.04440: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "type == 'dummy' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was False" } 7557 1726882106.04503: no more pending results, returning what we have 7557 1726882106.04507: results queue empty 7557 1726882106.04508: checking for any_errors_fatal 7557 1726882106.04517: done checking for any_errors_fatal 7557 1726882106.04517: checking for max_fail_percentage 7557 1726882106.04520: done checking for max_fail_percentage 7557 1726882106.04521: checking to see if all hosts have failed and the running result is not ok 7557 1726882106.04522: done checking to see if all hosts have failed 7557 1726882106.04522: getting the remaining hosts for this loop 7557 1726882106.04524: done getting the remaining hosts for this loop 7557 1726882106.04527: getting the next task for host managed_node3 7557 1726882106.04535: done getting next task for host managed_node3 7557 1726882106.04538: ^ task is: TASK: Create tap interface {{ interface }} 7557 1726882106.04542: ^ state is: HOST STATE: block=2, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7557 1726882106.04547: getting variables 7557 1726882106.04549: in VariableManager get_vars() 7557 1726882106.04728: Calling all_inventory to load vars for managed_node3 7557 1726882106.04731: Calling groups_inventory to load vars for managed_node3 7557 1726882106.04734: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882106.04748: Calling all_plugins_play to load vars for managed_node3 7557 1726882106.04751: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882106.04754: Calling groups_plugins_play to load vars for managed_node3 7557 1726882106.06469: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882106.08175: done with get_vars() 7557 1726882106.08207: done getting variables 7557 1726882106.08278: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 7557 1726882106.08409: variable 'interface' from source: play vars TASK [Create tap interface veth0] ********************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:60 Friday 20 September 2024 21:28:26 -0400 (0:00:00.068) 0:00:31.937 ****** 7557 1726882106.08442: entering _queue_task() for managed_node3/command 7557 1726882106.08920: worker is 1 (out of 1 available) 7557 1726882106.08931: exiting _queue_task() for managed_node3/command 7557 1726882106.08945: done queuing things up, now waiting for results queue to drain 7557 1726882106.08946: waiting for pending results... 
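Note where each variable in these conditionals comes from: the log tags 'type' and 'interface' as play vars, 'state' as include params, and 'current_interfaces' as a set_fact result. A hedged sketch of that layering (variable names from the log; values and file layout are assumptions for illustration):

  # play-level vars ("play vars" in the log)
  vars:
    type: veth          # assumed value
    interface: veth0

  tasks:
    # 'state' arrives as an include parameter ("include params" in the log)
    - include_tasks: tasks/manage_test_interface.yml
      vars:
        state: present  # assumed value

current_interfaces is populated earlier in the run by a set_fact task, which is why each create/delete conditional can test membership against it.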
7557 1726882106.09327: running TaskExecutor() for managed_node3/TASK: Create tap interface veth0 7557 1726882106.09332: in run() - task 12673a56-9f93-ed48-b3a5-0000000010b3 7557 1726882106.09336: variable 'ansible_search_path' from source: unknown 7557 1726882106.09344: variable 'ansible_search_path' from source: unknown 7557 1726882106.09370: calling self._execute() 7557 1726882106.09491: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882106.09508: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882106.09522: variable 'omit' from source: magic vars 7557 1726882106.09968: variable 'ansible_distribution_major_version' from source: facts 7557 1726882106.09971: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882106.10170: variable 'type' from source: play vars 7557 1726882106.10186: variable 'state' from source: include params 7557 1726882106.10200: variable 'interface' from source: play vars 7557 1726882106.10215: variable 'current_interfaces' from source: set_fact 7557 1726882106.10230: Evaluated conditional (type == 'tap' and state == 'present' and interface not in current_interfaces): False 7557 1726882106.10238: when evaluation is False, skipping this task 7557 1726882106.10244: _execute() done 7557 1726882106.10296: dumping result to json 7557 1726882106.10300: done dumping result, returning 7557 1726882106.10303: done running TaskExecutor() for managed_node3/TASK: Create tap interface veth0 [12673a56-9f93-ed48-b3a5-0000000010b3] 7557 1726882106.10305: sending task result for task 12673a56-9f93-ed48-b3a5-0000000010b3 skipping: [managed_node3] => { "changed": false, "false_condition": "type == 'tap' and state == 'present' and interface not in current_interfaces", "skip_reason": "Conditional result was False" } 7557 1726882106.10480: no more pending results, returning what we have 7557 1726882106.10484: results queue empty 7557 1726882106.10485: checking for any_errors_fatal 7557 1726882106.10490: done checking for any_errors_fatal 7557 1726882106.10491: checking for max_fail_percentage 7557 1726882106.10497: done checking for max_fail_percentage 7557 1726882106.10498: checking to see if all hosts have failed and the running result is not ok 7557 1726882106.10499: done checking to see if all hosts have failed 7557 1726882106.10499: getting the remaining hosts for this loop 7557 1726882106.10501: done getting the remaining hosts for this loop 7557 1726882106.10506: getting the next task for host managed_node3 7557 1726882106.10512: done getting next task for host managed_node3 7557 1726882106.10515: ^ task is: TASK: Delete tap interface {{ interface }} 7557 1726882106.10519: ^ state is: HOST STATE: block=2, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7557 1726882106.10524: getting variables 7557 1726882106.10526: in VariableManager get_vars() 7557 1726882106.10583: Calling all_inventory to load vars for managed_node3 7557 1726882106.10586: Calling groups_inventory to load vars for managed_node3 7557 1726882106.10589: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882106.10599: done sending task result for task 12673a56-9f93-ed48-b3a5-0000000010b3 7557 1726882106.10602: WORKER PROCESS EXITING 7557 1726882106.10808: Calling all_plugins_play to load vars for managed_node3 7557 1726882106.10812: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882106.10815: Calling groups_plugins_play to load vars for managed_node3 7557 1726882106.12363: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882106.13241: done with get_vars() 7557 1726882106.13258: done getting variables 7557 1726882106.13308: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 7557 1726882106.13388: variable 'interface' from source: play vars TASK [Delete tap interface veth0] ********************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:65 Friday 20 September 2024 21:28:26 -0400 (0:00:00.049) 0:00:31.987 ****** 7557 1726882106.13417: entering _queue_task() for managed_node3/command 7557 1726882106.13659: worker is 1 (out of 1 available) 7557 1726882106.13673: exiting _queue_task() for managed_node3/command 7557 1726882106.13686: done queuing things up, now waiting for results queue to drain 7557 1726882106.13687: waiting for pending results... 
7557 1726882106.13864: running TaskExecutor() for managed_node3/TASK: Delete tap interface veth0 7557 1726882106.13942: in run() - task 12673a56-9f93-ed48-b3a5-0000000010b4 7557 1726882106.13953: variable 'ansible_search_path' from source: unknown 7557 1726882106.13957: variable 'ansible_search_path' from source: unknown 7557 1726882106.13986: calling self._execute() 7557 1726882106.14083: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882106.14086: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882106.14099: variable 'omit' from source: magic vars 7557 1726882106.14599: variable 'ansible_distribution_major_version' from source: facts 7557 1726882106.14603: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882106.14709: variable 'type' from source: play vars 7557 1726882106.14729: variable 'state' from source: include params 7557 1726882106.14739: variable 'interface' from source: play vars 7557 1726882106.14746: variable 'current_interfaces' from source: set_fact 7557 1726882106.14757: Evaluated conditional (type == 'tap' and state == 'absent' and interface in current_interfaces): False 7557 1726882106.14765: when evaluation is False, skipping this task 7557 1726882106.14771: _execute() done 7557 1726882106.14777: dumping result to json 7557 1726882106.14783: done dumping result, returning 7557 1726882106.14797: done running TaskExecutor() for managed_node3/TASK: Delete tap interface veth0 [12673a56-9f93-ed48-b3a5-0000000010b4] 7557 1726882106.14831: sending task result for task 12673a56-9f93-ed48-b3a5-0000000010b4 7557 1726882106.15119: done sending task result for task 12673a56-9f93-ed48-b3a5-0000000010b4 7557 1726882106.15122: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "type == 'tap' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was False" } 7557 1726882106.15168: no more pending results, returning what we have 7557 1726882106.15171: results queue empty 7557 1726882106.15172: checking for any_errors_fatal 7557 1726882106.15176: done checking for any_errors_fatal 7557 1726882106.15177: checking for max_fail_percentage 7557 1726882106.15178: done checking for max_fail_percentage 7557 1726882106.15179: checking to see if all hosts have failed and the running result is not ok 7557 1726882106.15180: done checking to see if all hosts have failed 7557 1726882106.15181: getting the remaining hosts for this loop 7557 1726882106.15182: done getting the remaining hosts for this loop 7557 1726882106.15185: getting the next task for host managed_node3 7557 1726882106.15196: done getting next task for host managed_node3 7557 1726882106.15200: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 7557 1726882106.15203: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7557 1726882106.15222: getting variables 7557 1726882106.15223: in VariableManager get_vars() 7557 1726882106.15270: Calling all_inventory to load vars for managed_node3 7557 1726882106.15272: Calling groups_inventory to load vars for managed_node3 7557 1726882106.15275: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882106.15284: Calling all_plugins_play to load vars for managed_node3 7557 1726882106.15287: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882106.15290: Calling groups_plugins_play to load vars for managed_node3 7557 1726882106.16065: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882106.16940: done with get_vars() 7557 1726882106.16957: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 21:28:26 -0400 (0:00:00.036) 0:00:32.023 ****** 7557 1726882106.17030: entering _queue_task() for managed_node3/include_tasks 7557 1726882106.17262: worker is 1 (out of 1 available) 7557 1726882106.17274: exiting _queue_task() for managed_node3/include_tasks 7557 1726882106.17286: done queuing things up, now waiting for results queue to drain 7557 1726882106.17288: waiting for pending results... 7557 1726882106.17628: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 7557 1726882106.17640: in run() - task 12673a56-9f93-ed48-b3a5-0000000000b8 7557 1726882106.17655: variable 'ansible_search_path' from source: unknown 7557 1726882106.17659: variable 'ansible_search_path' from source: unknown 7557 1726882106.17699: calling self._execute() 7557 1726882106.17796: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882106.17801: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882106.17817: variable 'omit' from source: magic vars 7557 1726882106.18183: variable 'ansible_distribution_major_version' from source: facts 7557 1726882106.18200: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882106.18205: _execute() done 7557 1726882106.18208: dumping result to json 7557 1726882106.18211: done dumping result, returning 7557 1726882106.18222: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [12673a56-9f93-ed48-b3a5-0000000000b8] 7557 1726882106.18225: sending task result for task 12673a56-9f93-ed48-b3a5-0000000000b8 7557 1726882106.18398: done sending task result for task 12673a56-9f93-ed48-b3a5-0000000000b8 7557 1726882106.18402: WORKER PROCESS EXITING 7557 1726882106.18444: no more pending results, returning what we have 7557 1726882106.18448: in VariableManager get_vars() 7557 1726882106.18553: Calling all_inventory to load vars for managed_node3 7557 1726882106.18556: Calling groups_inventory to load vars for managed_node3 7557 1726882106.18559: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882106.18569: Calling all_plugins_play to load vars for managed_node3 7557 1726882106.18571: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882106.18574: Calling groups_plugins_play to load vars for managed_node3 7557 1726882106.19625: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to 
reserved name 7557 1726882106.20478: done with get_vars() 7557 1726882106.20492: variable 'ansible_search_path' from source: unknown 7557 1726882106.20494: variable 'ansible_search_path' from source: unknown 7557 1726882106.20522: we have included files to process 7557 1726882106.20523: generating all_blocks data 7557 1726882106.20525: done generating all_blocks data 7557 1726882106.20529: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 7557 1726882106.20530: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 7557 1726882106.20531: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 7557 1726882106.21028: done processing included file 7557 1726882106.21030: iterating over new_blocks loaded from include file 7557 1726882106.21031: in VariableManager get_vars() 7557 1726882106.21060: done with get_vars() 7557 1726882106.21061: filtering new block on tags 7557 1726882106.21078: done filtering new block on tags 7557 1726882106.21080: in VariableManager get_vars() 7557 1726882106.21108: done with get_vars() 7557 1726882106.21110: filtering new block on tags 7557 1726882106.21133: done filtering new block on tags 7557 1726882106.21136: in VariableManager get_vars() 7557 1726882106.21164: done with get_vars() 7557 1726882106.21166: filtering new block on tags 7557 1726882106.21186: done filtering new block on tags 7557 1726882106.21188: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node3 7557 1726882106.21195: extending task lists for all hosts with included blocks 7557 1726882106.21910: done extending task lists 7557 1726882106.21911: done processing included files 7557 1726882106.21912: results queue empty 7557 1726882106.21912: checking for any_errors_fatal 7557 1726882106.21914: done checking for any_errors_fatal 7557 1726882106.21915: checking for max_fail_percentage 7557 1726882106.21915: done checking for max_fail_percentage 7557 1726882106.21916: checking to see if all hosts have failed and the running result is not ok 7557 1726882106.21916: done checking to see if all hosts have failed 7557 1726882106.21917: getting the remaining hosts for this loop 7557 1726882106.21917: done getting the remaining hosts for this loop 7557 1726882106.21919: getting the next task for host managed_node3 7557 1726882106.21922: done getting next task for host managed_node3 7557 1726882106.21923: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 7557 1726882106.21925: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7557 1726882106.21937: getting variables 7557 1726882106.21938: in VariableManager get_vars() 7557 1726882106.21957: Calling all_inventory to load vars for managed_node3 7557 1726882106.21959: Calling groups_inventory to load vars for managed_node3 7557 1726882106.21961: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882106.21966: Calling all_plugins_play to load vars for managed_node3 7557 1726882106.21968: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882106.21969: Calling groups_plugins_play to load vars for managed_node3 7557 1726882106.23075: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882106.23946: done with get_vars() 7557 1726882106.23960: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Friday 20 September 2024 21:28:26 -0400 (0:00:00.069) 0:00:32.093 ****** 7557 1726882106.24015: entering _queue_task() for managed_node3/setup 7557 1726882106.24256: worker is 1 (out of 1 available) 7557 1726882106.24269: exiting _queue_task() for managed_node3/setup 7557 1726882106.24281: done queuing things up, now waiting for results queue to drain 7557 1726882106.24283: waiting for pending results... 7557 1726882106.24466: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 7557 1726882106.24567: in run() - task 12673a56-9f93-ed48-b3a5-000000001381 7557 1726882106.24578: variable 'ansible_search_path' from source: unknown 7557 1726882106.24582: variable 'ansible_search_path' from source: unknown 7557 1726882106.24622: calling self._execute() 7557 1726882106.24692: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882106.24701: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882106.24710: variable 'omit' from source: magic vars 7557 1726882106.25199: variable 'ansible_distribution_major_version' from source: facts 7557 1726882106.25202: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882106.25313: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 7557 1726882106.26978: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 7557 1726882106.27036: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 7557 1726882106.27064: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 7557 1726882106.27089: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 7557 1726882106.27114: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 7557 1726882106.27169: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7557 1726882106.27192: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7557 1726882106.27213: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7557 1726882106.27240: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7557 1726882106.27251: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7557 1726882106.27290: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7557 1726882106.27311: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7557 1726882106.27327: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7557 1726882106.27351: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7557 1726882106.27361: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7557 1726882106.27466: variable '__network_required_facts' from source: role '' defaults 7557 1726882106.27474: variable 'ansible_facts' from source: unknown 7557 1726882106.27936: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 7557 1726882106.27940: when evaluation is False, skipping this task 7557 1726882106.27942: _execute() done 7557 1726882106.27945: dumping result to json 7557 1726882106.27947: done dumping result, returning 7557 1726882106.27954: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [12673a56-9f93-ed48-b3a5-000000001381] 7557 1726882106.27959: sending task result for task 12673a56-9f93-ed48-b3a5-000000001381 7557 1726882106.28044: done sending task result for task 12673a56-9f93-ed48-b3a5-000000001381 7557 1726882106.28046: WORKER PROCESS EXITING skipping: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 7557 1726882106.28086: no more pending results, returning what we have 7557 1726882106.28089: results queue empty 7557 1726882106.28090: checking for any_errors_fatal 7557 1726882106.28091: done checking for any_errors_fatal 7557 1726882106.28092: checking for max_fail_percentage 7557 1726882106.28096: done checking for max_fail_percentage 7557 1726882106.28097: checking to see if all hosts have failed and the running result is not ok 7557 1726882106.28098: done checking to see if all hosts have failed 7557 1726882106.28098: getting the remaining hosts for 
this loop 7557 1726882106.28100: done getting the remaining hosts for this loop 7557 1726882106.28103: getting the next task for host managed_node3 7557 1726882106.28111: done getting next task for host managed_node3 7557 1726882106.28115: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 7557 1726882106.28119: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7557 1726882106.28140: getting variables 7557 1726882106.28141: in VariableManager get_vars() 7557 1726882106.28190: Calling all_inventory to load vars for managed_node3 7557 1726882106.28192: Calling groups_inventory to load vars for managed_node3 7557 1726882106.28203: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882106.28213: Calling all_plugins_play to load vars for managed_node3 7557 1726882106.28215: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882106.28218: Calling groups_plugins_play to load vars for managed_node3 7557 1726882106.29474: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882106.30403: done with get_vars() 7557 1726882106.30418: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Friday 20 September 2024 21:28:26 -0400 (0:00:00.064) 0:00:32.157 ****** 7557 1726882106.30488: entering _queue_task() for managed_node3/stat 7557 1726882106.30709: worker is 1 (out of 1 available) 7557 1726882106.30724: exiting _queue_task() for managed_node3/stat 7557 1726882106.30737: done queuing things up, now waiting for results queue to drain 7557 1726882106.30738: waiting for pending results... 
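The "Ensure ansible_facts used by role are present" task gathers facts only when something listed in __network_required_facts is missing from ansible_facts; here the difference is empty, so the setup call is skipped (and its result is censored because the task sets no_log). A sketch of that guard, with the when expression copied from the log and the setup arguments assumed:

  - name: Ensure ansible_facts used by role are present
    setup:
      gather_subset: min   # assumed; the real subset list is not shown in this log
    when: __network_required_facts | difference(ansible_facts.keys() | list) | length > 0
    no_log: true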
7557 1726882106.30917: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if system is ostree 7557 1726882106.31014: in run() - task 12673a56-9f93-ed48-b3a5-000000001383 7557 1726882106.31027: variable 'ansible_search_path' from source: unknown 7557 1726882106.31030: variable 'ansible_search_path' from source: unknown 7557 1726882106.31057: calling self._execute() 7557 1726882106.31136: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882106.31139: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882106.31149: variable 'omit' from source: magic vars 7557 1726882106.31413: variable 'ansible_distribution_major_version' from source: facts 7557 1726882106.31422: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882106.31537: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 7557 1726882106.31766: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 7557 1726882106.31898: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 7557 1726882106.31901: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 7557 1726882106.31904: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 7557 1726882106.31932: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 7557 1726882106.32002: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 7557 1726882106.32034: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 7557 1726882106.32065: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 7557 1726882106.32158: variable '__network_is_ostree' from source: set_fact 7557 1726882106.32171: Evaluated conditional (not __network_is_ostree is defined): False 7557 1726882106.32179: when evaluation is False, skipping this task 7557 1726882106.32188: _execute() done 7557 1726882106.32197: dumping result to json 7557 1726882106.32206: done dumping result, returning 7557 1726882106.32217: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if system is ostree [12673a56-9f93-ed48-b3a5-000000001383] 7557 1726882106.32227: sending task result for task 12673a56-9f93-ed48-b3a5-000000001383 7557 1726882106.32399: done sending task result for task 12673a56-9f93-ed48-b3a5-000000001383 7557 1726882106.32403: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 7557 1726882106.32459: no more pending results, returning what we have 7557 1726882106.32463: results queue empty 7557 1726882106.32464: checking for any_errors_fatal 7557 1726882106.32470: done checking for any_errors_fatal 7557 1726882106.32471: checking for max_fail_percentage 7557 
1726882106.32473: done checking for max_fail_percentage 7557 1726882106.32474: checking to see if all hosts have failed and the running result is not ok 7557 1726882106.32475: done checking to see if all hosts have failed 7557 1726882106.32476: getting the remaining hosts for this loop 7557 1726882106.32477: done getting the remaining hosts for this loop 7557 1726882106.32481: getting the next task for host managed_node3 7557 1726882106.32487: done getting next task for host managed_node3 7557 1726882106.32491: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 7557 1726882106.32497: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7557 1726882106.32519: getting variables 7557 1726882106.32520: in VariableManager get_vars() 7557 1726882106.32574: Calling all_inventory to load vars for managed_node3 7557 1726882106.32577: Calling groups_inventory to load vars for managed_node3 7557 1726882106.32580: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882106.32590: Calling all_plugins_play to load vars for managed_node3 7557 1726882106.32796: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882106.32802: Calling groups_plugins_play to load vars for managed_node3 7557 1726882106.33728: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882106.34578: done with get_vars() 7557 1726882106.34595: done getting variables 7557 1726882106.34645: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Friday 20 September 2024 21:28:26 -0400 (0:00:00.041) 0:00:32.199 ****** 7557 1726882106.34683: entering _queue_task() for managed_node3/set_fact 7557 1726882106.34945: worker is 1 (out of 1 available) 7557 1726882106.34958: exiting _queue_task() for managed_node3/set_fact 7557 1726882106.34971: done queuing things up, now waiting for results queue to drain 7557 1726882106.34972: waiting for pending results... 
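"Check if system is ostree" is queued for the stat action and guarded so it runs at most once per play: once __network_is_ostree exists, the check is skipped, which is what happens here. A sketch under that reading; the task name, module, and when expression come from the log, while the path and register name are assumptions:

  - name: Check if system is ostree
    stat:
      path: /run/ostree-booted      # assumed path, not shown in this log
    register: __ostree_booted_stat  # assumed register name
    when: not __network_is_ostree is defined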
7557 1726882106.35256: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 7557 1726882106.35423: in run() - task 12673a56-9f93-ed48-b3a5-000000001384 7557 1726882106.35599: variable 'ansible_search_path' from source: unknown 7557 1726882106.35602: variable 'ansible_search_path' from source: unknown 7557 1726882106.35604: calling self._execute() 7557 1726882106.35607: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882106.35609: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882106.35612: variable 'omit' from source: magic vars 7557 1726882106.35944: variable 'ansible_distribution_major_version' from source: facts 7557 1726882106.35962: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882106.36128: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 7557 1726882106.36401: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 7557 1726882106.36450: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 7557 1726882106.36490: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 7557 1726882106.36532: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 7557 1726882106.36618: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 7557 1726882106.36684: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 7557 1726882106.36721: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 7557 1726882106.36751: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 7557 1726882106.36850: variable '__network_is_ostree' from source: set_fact 7557 1726882106.36866: Evaluated conditional (not __network_is_ostree is defined): False 7557 1726882106.36876: when evaluation is False, skipping this task 7557 1726882106.36885: _execute() done 7557 1726882106.36896: dumping result to json 7557 1726882106.36906: done dumping result, returning 7557 1726882106.37026: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [12673a56-9f93-ed48-b3a5-000000001384] 7557 1726882106.37029: sending task result for task 12673a56-9f93-ed48-b3a5-000000001384 7557 1726882106.37092: done sending task result for task 12673a56-9f93-ed48-b3a5-000000001384 7557 1726882106.37097: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 7557 1726882106.37173: no more pending results, returning what we have 7557 1726882106.37177: results queue empty 7557 1726882106.37178: checking for any_errors_fatal 7557 1726882106.37184: done checking for any_errors_fatal 7557 1726882106.37185: checking for 
max_fail_percentage 7557 1726882106.37186: done checking for max_fail_percentage 7557 1726882106.37187: checking to see if all hosts have failed and the running result is not ok 7557 1726882106.37188: done checking to see if all hosts have failed 7557 1726882106.37189: getting the remaining hosts for this loop 7557 1726882106.37191: done getting the remaining hosts for this loop 7557 1726882106.37196: getting the next task for host managed_node3 7557 1726882106.37206: done getting next task for host managed_node3 7557 1726882106.37210: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 7557 1726882106.37214: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7557 1726882106.37234: getting variables 7557 1726882106.37236: in VariableManager get_vars() 7557 1726882106.37287: Calling all_inventory to load vars for managed_node3 7557 1726882106.37290: Calling groups_inventory to load vars for managed_node3 7557 1726882106.37459: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882106.37469: Calling all_plugins_play to load vars for managed_node3 7557 1726882106.37471: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882106.37474: Calling groups_plugins_play to load vars for managed_node3 7557 1726882106.38716: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882106.40221: done with get_vars() 7557 1726882106.40244: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Friday 20 September 2024 21:28:26 -0400 (0:00:00.056) 0:00:32.256 ****** 7557 1726882106.40341: entering _queue_task() for managed_node3/service_facts 7557 1726882106.40626: worker is 1 (out of 1 available) 7557 1726882106.40639: exiting _queue_task() for managed_node3/service_facts 7557 1726882106.40651: done queuing things up, now waiting for results queue to drain 7557 1726882106.40653: waiting for pending results... 
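The companion "Set flag to indicate system is ostree" task would turn that stat result into the __network_is_ostree fact under the same guard, and the next task, "Check which services are running", is a plain service_facts call (queued above as managed_node3/service_facts) with no skip condition, so it actually contacts the host. A sketch, with the fact expression assumed:

  - name: Set flag to indicate system is ostree
    set_fact:
      __network_is_ostree: "{{ __ostree_booted_stat.stat.exists | d(false) }}"  # assumed expression
    when: not __network_is_ostree is defined

  - name: Check which services are running
    service_facts: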
7557 1726882106.40944: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which services are running 7557 1726882106.41119: in run() - task 12673a56-9f93-ed48-b3a5-000000001386 7557 1726882106.41127: variable 'ansible_search_path' from source: unknown 7557 1726882106.41136: variable 'ansible_search_path' from source: unknown 7557 1726882106.41227: calling self._execute() 7557 1726882106.41284: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882106.41300: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882106.41314: variable 'omit' from source: magic vars 7557 1726882106.41692: variable 'ansible_distribution_major_version' from source: facts 7557 1726882106.41711: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882106.41720: variable 'omit' from source: magic vars 7557 1726882106.41795: variable 'omit' from source: magic vars 7557 1726882106.41831: variable 'omit' from source: magic vars 7557 1726882106.41879: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7557 1726882106.42098: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7557 1726882106.42101: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7557 1726882106.42103: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7557 1726882106.42106: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7557 1726882106.42107: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7557 1726882106.42109: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882106.42111: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882106.42113: Set connection var ansible_module_compression to ZIP_DEFLATED 7557 1726882106.42114: Set connection var ansible_shell_executable to /bin/sh 7557 1726882106.42116: Set connection var ansible_shell_type to sh 7557 1726882106.42118: Set connection var ansible_pipelining to False 7557 1726882106.42124: Set connection var ansible_connection to ssh 7557 1726882106.42133: Set connection var ansible_timeout to 10 7557 1726882106.42155: variable 'ansible_shell_executable' from source: unknown 7557 1726882106.42162: variable 'ansible_connection' from source: unknown 7557 1726882106.42168: variable 'ansible_module_compression' from source: unknown 7557 1726882106.42173: variable 'ansible_shell_type' from source: unknown 7557 1726882106.42177: variable 'ansible_shell_executable' from source: unknown 7557 1726882106.42182: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882106.42188: variable 'ansible_pipelining' from source: unknown 7557 1726882106.42195: variable 'ansible_timeout' from source: unknown 7557 1726882106.42203: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882106.42383: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 7557 1726882106.42402: variable 'omit' from source: magic vars 7557 1726882106.42409: starting attempt loop 
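Before the handler runs, the executor resolves the effective connection settings for managed_node3; the "Set connection var" lines above record them. Restated as host variables (the values are exactly those in the log; pinning them in inventory is just one way to make them explicit):

  managed_node3:
    ansible_connection: ssh
    ansible_shell_type: sh
    ansible_shell_executable: /bin/sh
    ansible_pipelining: false
    ansible_timeout: 10
  # ansible_module_compression resolves to ZIP_DEFLATED, the default, and is normally left unset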
7557 1726882106.42417: running the handler 7557 1726882106.42437: _low_level_execute_command(): starting 7557 1726882106.42454: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7557 1726882106.43160: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7557 1726882106.43178: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7557 1726882106.43196: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882106.43224: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7557 1726882106.43329: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882106.43353: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882106.43439: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882106.45129: stdout chunk (state=3): >>>/root <<< 7557 1726882106.45283: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882106.45287: stdout chunk (state=3): >>><<< 7557 1726882106.45289: stderr chunk (state=3): >>><<< 7557 1726882106.45314: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7557 1726882106.45413: _low_level_execute_command(): starting 7557 1726882106.45417: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882106.4532235-8856-72447788330405 `" && echo 
ansible-tmp-1726882106.4532235-8856-72447788330405="` echo /root/.ansible/tmp/ansible-tmp-1726882106.4532235-8856-72447788330405 `" ) && sleep 0' 7557 1726882106.45977: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7557 1726882106.45998: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7557 1726882106.46064: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882106.46132: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882106.46148: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882106.46177: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882106.46258: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882106.48085: stdout chunk (state=3): >>>ansible-tmp-1726882106.4532235-8856-72447788330405=/root/.ansible/tmp/ansible-tmp-1726882106.4532235-8856-72447788330405 <<< 7557 1726882106.48231: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882106.48243: stderr chunk (state=3): >>><<< 7557 1726882106.48256: stdout chunk (state=3): >>><<< 7557 1726882106.48358: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882106.4532235-8856-72447788330405=/root/.ansible/tmp/ansible-tmp-1726882106.4532235-8856-72447788330405 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7557 1726882106.48362: variable 'ansible_module_compression' from source: unknown 7557 1726882106.48381: ANSIBALLZ: using cached module: 
/root/.ansible/tmp/ansible-local-7557ap94rh2e/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 7557 1726882106.48424: variable 'ansible_facts' from source: unknown 7557 1726882106.48527: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882106.4532235-8856-72447788330405/AnsiballZ_service_facts.py 7557 1726882106.48758: Sending initial data 7557 1726882106.48771: Sent initial data (159 bytes) 7557 1726882106.49314: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882106.49415: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882106.49438: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882106.49454: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882106.49477: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882106.49557: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882106.51057: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 7557 1726882106.51089: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
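The exchange around this point follows ansible-core's usual remote execution flow for a single task: discover the remote home directory, create a per-task directory under /root/.ansible/tmp, push the cached AnsiballZ_service_facts.py payload to the target over SFTP, mark it executable with chmod u+x, and finally run it with /usr/bin/python3.12 so it can print its JSON result on stdout. As a rough sketch only, a minimal playbook along the following lines would drive this same flow and expose the ansible_facts.services mapping that appears in the module output below; the hosts pattern and the debug task are illustrative assumptions and are not taken from this run, while the firewalld.service key and the name/state/status/source fields match the shape of the returned facts.

- name: Gather and inspect service facts (illustrative sketch, not the exact playbook from this run)
  hosts: all
  gather_facts: false
  tasks:
    - name: Collect service state with the service_facts module (takes no arguments, as in the logged invocation)
      ansible.builtin.service_facts:

    - name: Report whether firewalld is running, defaulting to 'absent' if the key is missing
      ansible.builtin.debug:
        msg: "firewalld state: {{ ansible_facts.services['firewalld.service'].state | default('absent') }}"

Each entry in ansible_facts.services is keyed by the unit name (for example sshd.service) and carries name, state, status, and source, which is the structure visible in the stdout chunks that follow.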
<<< 7557 1726882106.51165: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-7557ap94rh2e/tmpn5er3np0 /root/.ansible/tmp/ansible-tmp-1726882106.4532235-8856-72447788330405/AnsiballZ_service_facts.py <<< 7557 1726882106.51168: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882106.4532235-8856-72447788330405/AnsiballZ_service_facts.py" <<< 7557 1726882106.51243: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-7557ap94rh2e/tmpn5er3np0" to remote "/root/.ansible/tmp/ansible-tmp-1726882106.4532235-8856-72447788330405/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882106.4532235-8856-72447788330405/AnsiballZ_service_facts.py" <<< 7557 1726882106.52049: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882106.52100: stderr chunk (state=3): >>><<< 7557 1726882106.52103: stdout chunk (state=3): >>><<< 7557 1726882106.52105: done transferring module to remote 7557 1726882106.52121: _low_level_execute_command(): starting 7557 1726882106.52129: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882106.4532235-8856-72447788330405/ /root/.ansible/tmp/ansible-tmp-1726882106.4532235-8856-72447788330405/AnsiballZ_service_facts.py && sleep 0' 7557 1726882106.52799: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882106.52814: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882106.52899: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882106.52920: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882106.53011: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882106.54759: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882106.54763: stdout chunk (state=3): >>><<< 7557 1726882106.54770: stderr chunk (state=3): >>><<< 7557 1726882106.54772: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests 
final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7557 1726882106.54775: _low_level_execute_command(): starting 7557 1726882106.54777: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882106.4532235-8856-72447788330405/AnsiballZ_service_facts.py && sleep 0' 7557 1726882106.55340: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7557 1726882106.55353: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7557 1726882106.55365: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882106.55408: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882106.55470: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882106.55487: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882106.55515: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882106.55590: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882108.01961: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": 
"enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "st<<< 7557 1726882108.01980: stdout chunk (state=3): >>>opped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", 
"state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source":<<< 7557 1726882108.02002: stdout chunk (state=3): >>> "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", 
"state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "st<<< 7557 1726882108.02022: stdout chunk (state=3): >>>opped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": 
{"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_sta<<< 7557 1726882108.02051: stdout chunk (state=3): >>>t.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", 
"source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service<<< 7557 1726882108.02061: stdout chunk (state=3): >>>": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": 
"static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": 
"systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 7557 1726882108.03484: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. <<< 7557 1726882108.03515: stderr chunk (state=3): >>><<< 7557 1726882108.03519: stdout chunk (state=3): >>><<< 7557 1726882108.03545: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": 
{"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": 
{"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": 
"systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", 
"state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", 
"status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "kvm_stat.service": {"name": 
"kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": 
{"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": 
"systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. 
7557 1726882108.04215: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882106.4532235-8856-72447788330405/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7557 1726882108.04227: _low_level_execute_command(): starting 7557 1726882108.04230: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882106.4532235-8856-72447788330405/ > /dev/null 2>&1 && sleep 0' 7557 1726882108.04670: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882108.04673: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found <<< 7557 1726882108.04675: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882108.04677: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found <<< 7557 1726882108.04680: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882108.04733: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882108.04736: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882108.04741: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882108.04784: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882108.06560: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882108.06564: stdout chunk (state=3): >>><<< 7557 1726882108.06798: stderr chunk (state=3): >>><<< 7557 1726882108.06802: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7557 1726882108.06804: handler run complete 7557 1726882108.06806: variable 'ansible_facts' from source: unknown 7557 1726882108.06956: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882108.08149: variable 'ansible_facts' from source: unknown 7557 1726882108.08479: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882108.08989: attempt loop complete, returning result 7557 1726882108.08995: _execute() done 7557 1726882108.09006: dumping result to json 7557 1726882108.09064: done dumping result, returning 7557 1726882108.09121: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which services are running [12673a56-9f93-ed48-b3a5-000000001386] 7557 1726882108.09124: sending task result for task 12673a56-9f93-ed48-b3a5-000000001386 7557 1726882108.11033: done sending task result for task 12673a56-9f93-ed48-b3a5-000000001386 7557 1726882108.11089: WORKER PROCESS EXITING ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 7557 1726882108.11164: no more pending results, returning what we have 7557 1726882108.11168: results queue empty 7557 1726882108.11169: checking for any_errors_fatal 7557 1726882108.11172: done checking for any_errors_fatal 7557 1726882108.11173: checking for max_fail_percentage 7557 1726882108.11175: done checking for max_fail_percentage 7557 1726882108.11175: checking to see if all hosts have failed and the running result is not ok 7557 1726882108.11176: done checking to see if all hosts have failed 7557 1726882108.11177: getting the remaining hosts for this loop 7557 1726882108.11178: done getting the remaining hosts for this loop 7557 1726882108.11182: getting the next task for host managed_node3 7557 1726882108.11187: done getting next task for host managed_node3 7557 1726882108.11190: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 7557 1726882108.11197: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7557 1726882108.11209: getting variables 7557 1726882108.11211: in VariableManager get_vars() 7557 1726882108.11253: Calling all_inventory to load vars for managed_node3 7557 1726882108.11369: Calling groups_inventory to load vars for managed_node3 7557 1726882108.11373: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882108.11383: Calling all_plugins_play to load vars for managed_node3 7557 1726882108.11386: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882108.11389: Calling groups_plugins_play to load vars for managed_node3 7557 1726882108.13791: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882108.15499: done with get_vars() 7557 1726882108.15523: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 21:28:28 -0400 (0:00:01.752) 0:00:34.009 ****** 7557 1726882108.15637: entering _queue_task() for managed_node3/package_facts 7557 1726882108.15943: worker is 1 (out of 1 available) 7557 1726882108.15955: exiting _queue_task() for managed_node3/package_facts 7557 1726882108.15971: done queuing things up, now waiting for results queue to drain 7557 1726882108.15972: waiting for pending results... 7557 1726882108.16161: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which packages are installed 7557 1726882108.16265: in run() - task 12673a56-9f93-ed48-b3a5-000000001387 7557 1726882108.16277: variable 'ansible_search_path' from source: unknown 7557 1726882108.16281: variable 'ansible_search_path' from source: unknown 7557 1726882108.16314: calling self._execute() 7557 1726882108.16390: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882108.16399: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882108.16408: variable 'omit' from source: magic vars 7557 1726882108.16685: variable 'ansible_distribution_major_version' from source: facts 7557 1726882108.16696: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882108.16705: variable 'omit' from source: magic vars 7557 1726882108.16755: variable 'omit' from source: magic vars 7557 1726882108.16779: variable 'omit' from source: magic vars 7557 1726882108.16815: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7557 1726882108.16843: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7557 1726882108.16860: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7557 1726882108.16874: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7557 1726882108.16885: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7557 1726882108.16915: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7557 1726882108.16918: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882108.16921: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882108.16992: Set connection var ansible_module_compression to ZIP_DEFLATED 7557 1726882108.17002: Set connection 
var ansible_shell_executable to /bin/sh 7557 1726882108.17006: Set connection var ansible_shell_type to sh 7557 1726882108.17010: Set connection var ansible_pipelining to False 7557 1726882108.17014: Set connection var ansible_connection to ssh 7557 1726882108.17016: Set connection var ansible_timeout to 10 7557 1726882108.17035: variable 'ansible_shell_executable' from source: unknown 7557 1726882108.17039: variable 'ansible_connection' from source: unknown 7557 1726882108.17042: variable 'ansible_module_compression' from source: unknown 7557 1726882108.17044: variable 'ansible_shell_type' from source: unknown 7557 1726882108.17047: variable 'ansible_shell_executable' from source: unknown 7557 1726882108.17049: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882108.17052: variable 'ansible_pipelining' from source: unknown 7557 1726882108.17054: variable 'ansible_timeout' from source: unknown 7557 1726882108.17058: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882108.17207: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 7557 1726882108.17215: variable 'omit' from source: magic vars 7557 1726882108.17218: starting attempt loop 7557 1726882108.17223: running the handler 7557 1726882108.17236: _low_level_execute_command(): starting 7557 1726882108.17245: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7557 1726882108.18003: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882108.18008: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882108.18011: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882108.18023: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882108.18104: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882108.19699: stdout chunk (state=3): >>>/root <<< 7557 1726882108.19807: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882108.19827: stderr chunk (state=3): >>><<< 7557 1726882108.19831: stdout chunk (state=3): >>><<< 7557 1726882108.19850: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7557 1726882108.19864: _low_level_execute_command(): starting 7557 1726882108.19872: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882108.1985106-8919-3257973470171 `" && echo ansible-tmp-1726882108.1985106-8919-3257973470171="` echo /root/.ansible/tmp/ansible-tmp-1726882108.1985106-8919-3257973470171 `" ) && sleep 0' 7557 1726882108.20311: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882108.20318: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882108.20404: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882108.20472: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882108.22328: stdout chunk (state=3): >>>ansible-tmp-1726882108.1985106-8919-3257973470171=/root/.ansible/tmp/ansible-tmp-1726882108.1985106-8919-3257973470171 <<< 7557 1726882108.22432: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882108.22458: stderr chunk (state=3): >>><<< 7557 1726882108.22461: stdout chunk (state=3): >>><<< 7557 1726882108.22477: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882108.1985106-8919-3257973470171=/root/.ansible/tmp/ansible-tmp-1726882108.1985106-8919-3257973470171 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7557 1726882108.22522: variable 'ansible_module_compression' from source: unknown 7557 1726882108.22561: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-7557ap94rh2e/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 7557 1726882108.22614: variable 'ansible_facts' from source: unknown 7557 1726882108.22733: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882108.1985106-8919-3257973470171/AnsiballZ_package_facts.py 7557 1726882108.22865: Sending initial data 7557 1726882108.22869: Sent initial data (158 bytes) 7557 1726882108.23509: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882108.23545: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882108.23552: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882108.23555: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882108.23608: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882108.25117: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" 
revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 7557 1726882108.25129: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 7557 1726882108.25164: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 7557 1726882108.25218: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-7557ap94rh2e/tmpru5_5728 /root/.ansible/tmp/ansible-tmp-1726882108.1985106-8919-3257973470171/AnsiballZ_package_facts.py <<< 7557 1726882108.25222: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882108.1985106-8919-3257973470171/AnsiballZ_package_facts.py" <<< 7557 1726882108.25257: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-7557ap94rh2e/tmpru5_5728" to remote "/root/.ansible/tmp/ansible-tmp-1726882108.1985106-8919-3257973470171/AnsiballZ_package_facts.py" <<< 7557 1726882108.25263: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882108.1985106-8919-3257973470171/AnsiballZ_package_facts.py" <<< 7557 1726882108.26499: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882108.26502: stderr chunk (state=3): >>><<< 7557 1726882108.26512: stdout chunk (state=3): >>><<< 7557 1726882108.26553: done transferring module to remote 7557 1726882108.26564: _low_level_execute_command(): starting 7557 1726882108.26568: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882108.1985106-8919-3257973470171/ /root/.ansible/tmp/ansible-tmp-1726882108.1985106-8919-3257973470171/AnsiballZ_package_facts.py && sleep 0' 7557 1726882108.27133: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7557 1726882108.27159: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7557 1726882108.27162: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882108.27165: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7557 1726882108.27167: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882108.27179: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882108.27236: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882108.27240: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882108.27290: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882108.28992: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 
1726882108.29018: stderr chunk (state=3): >>><<< 7557 1726882108.29022: stdout chunk (state=3): >>><<< 7557 1726882108.29038: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7557 1726882108.29041: _low_level_execute_command(): starting 7557 1726882108.29046: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882108.1985106-8919-3257973470171/AnsiballZ_package_facts.py && sleep 0' 7557 1726882108.29475: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882108.29479: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882108.29481: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 7557 1726882108.29483: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found <<< 7557 1726882108.29485: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882108.29526: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882108.29529: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882108.29590: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882108.73726: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": 
[{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": 
"6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks"<<< 7557 1726882108.73748: stdout chunk (state=3): >>>: [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", 
"release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "rele<<< 7557 1726882108.73769: stdout chunk (state=3): >>>ase": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, 
"arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certm<<< 7557 1726882108.73801: stdout chunk (state=3): >>>ap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": 
[{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": 
[{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "sou<<< 7557 1726882108.73819: stdout chunk (state=3): >>>rce": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": 
"29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arc<<< 7557 1726882108.73852: stdout chunk (state=3): >>>h": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", 
"release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-resc<<< 7557 1726882108.73877: stdout chunk (state=3): >>>ue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1<<< 7557 
1726882108.73897: stdout chunk (state=3): >>>.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": 
"perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10<<< 7557 1726882108.73916: stdout chunk (state=3): >>>", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.<<< 7557 1726882108.73927: stdout chunk (state=3): >>>26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": 
"python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 7557 1726882108.75634: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. 
<<< 7557 1726882108.75656: stderr chunk (state=3): >>><<< 7557 1726882108.75659: stdout chunk (state=3): >>><<< 7557 1726882108.75701: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": 
"amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": 
"17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], 
"libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": 
"1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": 
"1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": 
[{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": 
"510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], 
"perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], 
"perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. 
7557 1726882108.76897: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882108.1985106-8919-3257973470171/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7557 1726882108.76912: _low_level_execute_command(): starting 7557 1726882108.76916: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882108.1985106-8919-3257973470171/ > /dev/null 2>&1 && sleep 0' 7557 1726882108.77371: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7557 1726882108.77374: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found <<< 7557 1726882108.77376: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 7557 1726882108.77378: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882108.77380: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882108.77429: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882108.77448: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882108.77496: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882108.79325: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882108.79355: stderr chunk (state=3): >>><<< 7557 1726882108.79358: stdout chunk (state=3): >>><<< 7557 1726882108.79375: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7557 1726882108.79381: handler run complete 7557 1726882108.79907: variable 'ansible_facts' from source: unknown 7557 1726882108.80183: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882108.81233: variable 'ansible_facts' from source: unknown 7557 1726882108.81481: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882108.81862: attempt loop complete, returning result 7557 1726882108.81871: _execute() done 7557 1726882108.81874: dumping result to json 7557 1726882108.81991: done dumping result, returning 7557 1726882108.82003: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which packages are installed [12673a56-9f93-ed48-b3a5-000000001387] 7557 1726882108.82008: sending task result for task 12673a56-9f93-ed48-b3a5-000000001387 7557 1726882108.83352: done sending task result for task 12673a56-9f93-ed48-b3a5-000000001387 7557 1726882108.83356: WORKER PROCESS EXITING ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 7557 1726882108.83448: no more pending results, returning what we have 7557 1726882108.83450: results queue empty 7557 1726882108.83451: checking for any_errors_fatal 7557 1726882108.83455: done checking for any_errors_fatal 7557 1726882108.83455: checking for max_fail_percentage 7557 1726882108.83456: done checking for max_fail_percentage 7557 1726882108.83457: checking to see if all hosts have failed and the running result is not ok 7557 1726882108.83457: done checking to see if all hosts have failed 7557 1726882108.83458: getting the remaining hosts for this loop 7557 1726882108.83459: done getting the remaining hosts for this loop 7557 1726882108.83461: getting the next task for host managed_node3 7557 1726882108.83466: done getting next task for host managed_node3 7557 1726882108.83469: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 7557 1726882108.83471: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7557 1726882108.83478: getting variables 7557 1726882108.83479: in VariableManager get_vars() 7557 1726882108.83512: Calling all_inventory to load vars for managed_node3 7557 1726882108.83514: Calling groups_inventory to load vars for managed_node3 7557 1726882108.83516: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882108.83522: Calling all_plugins_play to load vars for managed_node3 7557 1726882108.83524: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882108.83526: Calling groups_plugins_play to load vars for managed_node3 7557 1726882108.84218: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882108.85085: done with get_vars() 7557 1726882108.85106: done getting variables 7557 1726882108.85148: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 21:28:28 -0400 (0:00:00.695) 0:00:34.704 ****** 7557 1726882108.85178: entering _queue_task() for managed_node3/debug 7557 1726882108.85416: worker is 1 (out of 1 available) 7557 1726882108.85429: exiting _queue_task() for managed_node3/debug 7557 1726882108.85441: done queuing things up, now waiting for results queue to drain 7557 1726882108.85442: waiting for pending results... 7557 1726882108.85632: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Print network provider 7557 1726882108.85733: in run() - task 12673a56-9f93-ed48-b3a5-0000000000b9 7557 1726882108.85748: variable 'ansible_search_path' from source: unknown 7557 1726882108.85751: variable 'ansible_search_path' from source: unknown 7557 1726882108.85780: calling self._execute() 7557 1726882108.85864: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882108.85867: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882108.85875: variable 'omit' from source: magic vars 7557 1726882108.86159: variable 'ansible_distribution_major_version' from source: facts 7557 1726882108.86169: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882108.86172: variable 'omit' from source: magic vars 7557 1726882108.86214: variable 'omit' from source: magic vars 7557 1726882108.86280: variable 'network_provider' from source: set_fact 7557 1726882108.86299: variable 'omit' from source: magic vars 7557 1726882108.86330: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7557 1726882108.86357: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7557 1726882108.86373: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7557 1726882108.86390: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7557 1726882108.86402: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7557 1726882108.86426: variable 
'inventory_hostname' from source: host vars for 'managed_node3' 7557 1726882108.86429: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882108.86435: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882108.86507: Set connection var ansible_module_compression to ZIP_DEFLATED 7557 1726882108.86512: Set connection var ansible_shell_executable to /bin/sh 7557 1726882108.86515: Set connection var ansible_shell_type to sh 7557 1726882108.86521: Set connection var ansible_pipelining to False 7557 1726882108.86524: Set connection var ansible_connection to ssh 7557 1726882108.86527: Set connection var ansible_timeout to 10 7557 1726882108.86544: variable 'ansible_shell_executable' from source: unknown 7557 1726882108.86548: variable 'ansible_connection' from source: unknown 7557 1726882108.86550: variable 'ansible_module_compression' from source: unknown 7557 1726882108.86553: variable 'ansible_shell_type' from source: unknown 7557 1726882108.86556: variable 'ansible_shell_executable' from source: unknown 7557 1726882108.86558: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882108.86560: variable 'ansible_pipelining' from source: unknown 7557 1726882108.86562: variable 'ansible_timeout' from source: unknown 7557 1726882108.86565: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882108.86666: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 7557 1726882108.86675: variable 'omit' from source: magic vars 7557 1726882108.86679: starting attempt loop 7557 1726882108.86683: running the handler 7557 1726882108.86721: handler run complete 7557 1726882108.86731: attempt loop complete, returning result 7557 1726882108.86734: _execute() done 7557 1726882108.86736: dumping result to json 7557 1726882108.86738: done dumping result, returning 7557 1726882108.86746: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Print network provider [12673a56-9f93-ed48-b3a5-0000000000b9] 7557 1726882108.86749: sending task result for task 12673a56-9f93-ed48-b3a5-0000000000b9 7557 1726882108.86831: done sending task result for task 12673a56-9f93-ed48-b3a5-0000000000b9 7557 1726882108.86834: WORKER PROCESS EXITING ok: [managed_node3] => {} MSG: Using network provider: nm 7557 1726882108.86924: no more pending results, returning what we have 7557 1726882108.86927: results queue empty 7557 1726882108.86928: checking for any_errors_fatal 7557 1726882108.86933: done checking for any_errors_fatal 7557 1726882108.86934: checking for max_fail_percentage 7557 1726882108.86936: done checking for max_fail_percentage 7557 1726882108.86936: checking to see if all hosts have failed and the running result is not ok 7557 1726882108.86937: done checking to see if all hosts have failed 7557 1726882108.86938: getting the remaining hosts for this loop 7557 1726882108.86939: done getting the remaining hosts for this loop 7557 1726882108.86944: getting the next task for host managed_node3 7557 1726882108.86950: done getting next task for host managed_node3 7557 1726882108.86954: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts 
provider 7557 1726882108.86956: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7557 1726882108.86966: getting variables 7557 1726882108.86967: in VariableManager get_vars() 7557 1726882108.87009: Calling all_inventory to load vars for managed_node3 7557 1726882108.87012: Calling groups_inventory to load vars for managed_node3 7557 1726882108.87014: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882108.87022: Calling all_plugins_play to load vars for managed_node3 7557 1726882108.87025: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882108.87027: Calling groups_plugins_play to load vars for managed_node3 7557 1726882108.87838: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882108.88708: done with get_vars() 7557 1726882108.88723: done getting variables 7557 1726882108.88761: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 21:28:28 -0400 (0:00:00.036) 0:00:34.740 ****** 7557 1726882108.88785: entering _queue_task() for managed_node3/fail 7557 1726882108.88987: worker is 1 (out of 1 available) 7557 1726882108.89003: exiting _queue_task() for managed_node3/fail 7557 1726882108.89016: done queuing things up, now waiting for results queue to drain 7557 1726882108.89018: waiting for pending results... 
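For context, the "Print network provider" trace above corresponds to a plain debug task at roles/network/tasks/main.yml:7. A minimal sketch of such a task, reconstructed only from the task name, the action plugin (debug) and the rendered message ("Using network provider: nm") visible in the log; the real task body is not shown in this output:

    - name: Print network provider
      ansible.builtin.debug:
        msg: "Using network provider: {{ network_provider }}"

Per the trace, network_provider itself was populated earlier via set_fact ("variable 'network_provider' from source: set_fact").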
7557 1726882108.89181: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 7557 1726882108.89269: in run() - task 12673a56-9f93-ed48-b3a5-0000000000ba 7557 1726882108.89280: variable 'ansible_search_path' from source: unknown 7557 1726882108.89283: variable 'ansible_search_path' from source: unknown 7557 1726882108.89317: calling self._execute() 7557 1726882108.89391: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882108.89400: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882108.89408: variable 'omit' from source: magic vars 7557 1726882108.89675: variable 'ansible_distribution_major_version' from source: facts 7557 1726882108.89686: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882108.89768: variable 'network_state' from source: role '' defaults 7557 1726882108.89776: Evaluated conditional (network_state != {}): False 7557 1726882108.89779: when evaluation is False, skipping this task 7557 1726882108.89782: _execute() done 7557 1726882108.89786: dumping result to json 7557 1726882108.89789: done dumping result, returning 7557 1726882108.89802: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [12673a56-9f93-ed48-b3a5-0000000000ba] 7557 1726882108.89806: sending task result for task 12673a56-9f93-ed48-b3a5-0000000000ba 7557 1726882108.89884: done sending task result for task 12673a56-9f93-ed48-b3a5-0000000000ba 7557 1726882108.89888: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 7557 1726882108.89943: no more pending results, returning what we have 7557 1726882108.89947: results queue empty 7557 1726882108.89948: checking for any_errors_fatal 7557 1726882108.89953: done checking for any_errors_fatal 7557 1726882108.89953: checking for max_fail_percentage 7557 1726882108.89955: done checking for max_fail_percentage 7557 1726882108.89956: checking to see if all hosts have failed and the running result is not ok 7557 1726882108.89957: done checking to see if all hosts have failed 7557 1726882108.89957: getting the remaining hosts for this loop 7557 1726882108.89959: done getting the remaining hosts for this loop 7557 1726882108.89962: getting the next task for host managed_node3 7557 1726882108.89967: done getting next task for host managed_node3 7557 1726882108.89971: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 7557 1726882108.89974: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7557 1726882108.89989: getting variables 7557 1726882108.89990: in VariableManager get_vars() 7557 1726882108.90032: Calling all_inventory to load vars for managed_node3 7557 1726882108.90034: Calling groups_inventory to load vars for managed_node3 7557 1726882108.90036: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882108.90044: Calling all_plugins_play to load vars for managed_node3 7557 1726882108.90046: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882108.90049: Calling groups_plugins_play to load vars for managed_node3 7557 1726882108.90775: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882108.91747: done with get_vars() 7557 1726882108.91763: done getting variables 7557 1726882108.91813: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 21:28:28 -0400 (0:00:00.030) 0:00:34.771 ****** 7557 1726882108.91836: entering _queue_task() for managed_node3/fail 7557 1726882108.92058: worker is 1 (out of 1 available) 7557 1726882108.92071: exiting _queue_task() for managed_node3/fail 7557 1726882108.92083: done queuing things up, now waiting for results queue to drain 7557 1726882108.92084: waiting for pending results... 
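The skip just above shows how conditional tasks surface in this log: the when conditions are evaluated in order, the first one that evaluates false is recorded as false_condition, and the task is reported with skip_reason "Conditional result was False". A sketch of a fail task guarded the same way, with a placeholder message since the real text at main.yml:11 is not visible here:

    - name: Abort applying the network state configuration if using the `network_state` variable with the initscripts provider
      ansible.builtin.fail:
        msg: "placeholder message - the role's actual error text is not shown in this log"
      when: network_state != {}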
7557 1726882108.92267: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 7557 1726882108.92354: in run() - task 12673a56-9f93-ed48-b3a5-0000000000bb 7557 1726882108.92364: variable 'ansible_search_path' from source: unknown 7557 1726882108.92368: variable 'ansible_search_path' from source: unknown 7557 1726882108.92399: calling self._execute() 7557 1726882108.92478: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882108.92481: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882108.92491: variable 'omit' from source: magic vars 7557 1726882108.92766: variable 'ansible_distribution_major_version' from source: facts 7557 1726882108.92775: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882108.92861: variable 'network_state' from source: role '' defaults 7557 1726882108.92870: Evaluated conditional (network_state != {}): False 7557 1726882108.92873: when evaluation is False, skipping this task 7557 1726882108.92878: _execute() done 7557 1726882108.92881: dumping result to json 7557 1726882108.92883: done dumping result, returning 7557 1726882108.92887: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [12673a56-9f93-ed48-b3a5-0000000000bb] 7557 1726882108.92900: sending task result for task 12673a56-9f93-ed48-b3a5-0000000000bb skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 7557 1726882108.93026: no more pending results, returning what we have 7557 1726882108.93030: results queue empty 7557 1726882108.93031: checking for any_errors_fatal 7557 1726882108.93041: done checking for any_errors_fatal 7557 1726882108.93042: checking for max_fail_percentage 7557 1726882108.93044: done checking for max_fail_percentage 7557 1726882108.93044: checking to see if all hosts have failed and the running result is not ok 7557 1726882108.93045: done checking to see if all hosts have failed 7557 1726882108.93046: getting the remaining hosts for this loop 7557 1726882108.93047: done getting the remaining hosts for this loop 7557 1726882108.93051: getting the next task for host managed_node3 7557 1726882108.93057: done getting next task for host managed_node3 7557 1726882108.93060: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 7557 1726882108.93064: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7557 1726882108.93084: getting variables 7557 1726882108.93085: in VariableManager get_vars() 7557 1726882108.93129: Calling all_inventory to load vars for managed_node3 7557 1726882108.93132: Calling groups_inventory to load vars for managed_node3 7557 1726882108.93134: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882108.93142: Calling all_plugins_play to load vars for managed_node3 7557 1726882108.93145: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882108.93147: Calling groups_plugins_play to load vars for managed_node3 7557 1726882108.93898: done sending task result for task 12673a56-9f93-ed48-b3a5-0000000000bb 7557 1726882108.93902: WORKER PROCESS EXITING 7557 1726882108.93913: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882108.94799: done with get_vars() 7557 1726882108.94815: done getting variables 7557 1726882108.94859: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 21:28:28 -0400 (0:00:00.030) 0:00:34.801 ****** 7557 1726882108.94882: entering _queue_task() for managed_node3/fail 7557 1726882108.95116: worker is 1 (out of 1 available) 7557 1726882108.95131: exiting _queue_task() for managed_node3/fail 7557 1726882108.95145: done queuing things up, now waiting for results queue to drain 7557 1726882108.95146: waiting for pending results... 
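Note that the trace reports `variable 'network_state' from source: role '' defaults`, i.e. the value comes from the role's defaults rather than from play or host vars, which is why `network_state != {}` evaluates to False whenever the play never sets it. A sketch of the relevant default, assuming the conventional defaults/main.yml location:

    # roles/network/defaults/main.yml (sketch; inferred from the log's
    # "from source: role '' defaults" and the condition evaluating to False)
    network_state: {}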
7557 1726882108.95324: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 7557 1726882108.95413: in run() - task 12673a56-9f93-ed48-b3a5-0000000000bc 7557 1726882108.95424: variable 'ansible_search_path' from source: unknown 7557 1726882108.95428: variable 'ansible_search_path' from source: unknown 7557 1726882108.95455: calling self._execute() 7557 1726882108.95537: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882108.95542: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882108.95551: variable 'omit' from source: magic vars 7557 1726882108.95822: variable 'ansible_distribution_major_version' from source: facts 7557 1726882108.95832: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882108.95956: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 7557 1726882108.97456: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 7557 1726882108.97502: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 7557 1726882108.97529: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 7557 1726882108.97557: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 7557 1726882108.97577: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 7557 1726882108.97637: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7557 1726882108.97658: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7557 1726882108.97683: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7557 1726882108.97713: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7557 1726882108.97724: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7557 1726882108.97790: variable 'ansible_distribution_major_version' from source: facts 7557 1726882108.97806: Evaluated conditional (ansible_distribution_major_version | int > 9): True 7557 1726882108.97879: variable 'ansible_distribution' from source: facts 7557 1726882108.97891: variable '__network_rh_distros' from source: role '' defaults 7557 1726882108.97903: Evaluated conditional (ansible_distribution in __network_rh_distros): True 7557 1726882108.98061: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7557 1726882108.98077: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7557 1726882108.98097: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7557 1726882108.98127: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7557 1726882108.98137: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7557 1726882108.98169: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7557 1726882108.98184: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7557 1726882108.98206: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7557 1726882108.98235: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7557 1726882108.98245: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7557 1726882108.98273: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7557 1726882108.98291: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7557 1726882108.98311: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7557 1726882108.98339: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7557 1726882108.98349: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7557 1726882108.98552: variable 'network_connections' from source: task vars 7557 1726882108.98561: variable 'interface' from source: play vars 7557 1726882108.98609: variable 'interface' from source: play vars 7557 1726882108.98621: variable 'network_state' from source: role '' defaults 7557 1726882108.98667: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 7557 1726882108.98774: Loading TestModule 
'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 7557 1726882108.98804: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 7557 1726882108.98827: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 7557 1726882108.98848: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 7557 1726882108.98880: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 7557 1726882108.98900: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 7557 1726882108.98921: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 7557 1726882108.98939: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 7557 1726882108.98965: Evaluated conditional (network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0 or network_state.get("interfaces", []) | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0): False 7557 1726882108.98968: when evaluation is False, skipping this task 7557 1726882108.98973: _execute() done 7557 1726882108.98976: dumping result to json 7557 1726882108.98978: done dumping result, returning 7557 1726882108.98989: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [12673a56-9f93-ed48-b3a5-0000000000bc] 7557 1726882108.98991: sending task result for task 12673a56-9f93-ed48-b3a5-0000000000bc 7557 1726882108.99074: done sending task result for task 12673a56-9f93-ed48-b3a5-0000000000bc 7557 1726882108.99076: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_connections | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0 or network_state.get(\"interfaces\", []) | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0", "skip_reason": "Conditional result was False" } 7557 1726882108.99130: no more pending results, returning what we have 7557 1726882108.99134: results queue empty 7557 1726882108.99135: checking for any_errors_fatal 7557 1726882108.99141: done checking for any_errors_fatal 7557 1726882108.99142: checking for max_fail_percentage 7557 1726882108.99143: done checking for max_fail_percentage 7557 1726882108.99144: checking to see if all hosts have failed and the running result is not ok 7557 1726882108.99145: done checking to see if all hosts have failed 7557 1726882108.99146: getting the remaining hosts for this loop 7557 1726882108.99147: done getting the remaining hosts for this loop 7557 1726882108.99150: getting the next task for host managed_node3 7557 1726882108.99156: done getting next task for host managed_node3 7557 1726882108.99161: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates 
for network packages are available through the DNF package manager due to wireless or team interfaces 7557 1726882108.99164: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7557 1726882108.99184: getting variables 7557 1726882108.99185: in VariableManager get_vars() 7557 1726882108.99233: Calling all_inventory to load vars for managed_node3 7557 1726882108.99236: Calling groups_inventory to load vars for managed_node3 7557 1726882108.99238: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882108.99247: Calling all_plugins_play to load vars for managed_node3 7557 1726882108.99249: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882108.99251: Calling groups_plugins_play to load vars for managed_node3 7557 1726882109.00185: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882109.01027: done with get_vars() 7557 1726882109.01044: done getting variables 7557 1726882109.01084: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 21:28:29 -0400 (0:00:00.062) 0:00:34.864 ****** 7557 1726882109.01108: entering _queue_task() for managed_node3/dnf 7557 1726882109.01331: worker is 1 (out of 1 available) 7557 1726882109.01346: exiting _queue_task() for managed_node3/dnf 7557 1726882109.01358: done queuing things up, now waiting for results queue to drain 7557 1726882109.01359: waiting for pending results... 
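The teaming guard skipped above quotes its filter chain verbatim in false_condition. As an illustration only, the same expression can be evaluated against a hand-made connection list; the expression is taken from the log, while the sample data below is hypothetical:

    - name: Evaluate the teaming check from the log against a sample list
      ansible.builtin.debug:
        msg: >-
          {{ network_connections | selectattr('type', 'defined')
             | selectattr('type', 'match', '^team$') | list | length > 0 }}
      vars:
        network_connections:
          - name: eth0          # hypothetical entry; the play's real list is not shown here
            type: ethernet

With no connection of type "team" in the list (as here), the expression renders False, matching the skip recorded in the trace.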
7557 1726882109.01537: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 7557 1726882109.01628: in run() - task 12673a56-9f93-ed48-b3a5-0000000000bd 7557 1726882109.01638: variable 'ansible_search_path' from source: unknown 7557 1726882109.01641: variable 'ansible_search_path' from source: unknown 7557 1726882109.01667: calling self._execute() 7557 1726882109.01744: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882109.01748: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882109.01757: variable 'omit' from source: magic vars 7557 1726882109.02027: variable 'ansible_distribution_major_version' from source: facts 7557 1726882109.02035: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882109.02171: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 7557 1726882109.03654: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 7557 1726882109.03696: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 7557 1726882109.03724: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 7557 1726882109.03748: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 7557 1726882109.03770: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 7557 1726882109.03829: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7557 1726882109.03848: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7557 1726882109.03870: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7557 1726882109.03897: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7557 1726882109.03911: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7557 1726882109.03988: variable 'ansible_distribution' from source: facts 7557 1726882109.03992: variable 'ansible_distribution_major_version' from source: facts 7557 1726882109.04007: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 7557 1726882109.04079: variable '__network_wireless_connections_defined' from source: role '' defaults 7557 1726882109.04165: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7557 1726882109.04182: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7557 1726882109.04205: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7557 1726882109.04231: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7557 1726882109.04242: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7557 1726882109.04268: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7557 1726882109.04284: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7557 1726882109.04311: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7557 1726882109.04334: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7557 1726882109.04344: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7557 1726882109.04371: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7557 1726882109.04386: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7557 1726882109.04407: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7557 1726882109.04436: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7557 1726882109.04446: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7557 1726882109.04548: variable 'network_connections' from source: task vars 7557 1726882109.04557: variable 'interface' from source: play vars 7557 1726882109.04603: variable 'interface' from source: play vars 7557 1726882109.04655: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 7557 1726882109.04774: Loading TestModule 'core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 7557 1726882109.04804: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 7557 1726882109.04827: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 7557 1726882109.04850: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 7557 1726882109.04882: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 7557 1726882109.04900: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 7557 1726882109.04922: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 7557 1726882109.04939: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 7557 1726882109.04986: variable '__network_team_connections_defined' from source: role '' defaults 7557 1726882109.05139: variable 'network_connections' from source: task vars 7557 1726882109.05143: variable 'interface' from source: play vars 7557 1726882109.05187: variable 'interface' from source: play vars 7557 1726882109.05214: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 7557 1726882109.05218: when evaluation is False, skipping this task 7557 1726882109.05221: _execute() done 7557 1726882109.05224: dumping result to json 7557 1726882109.05226: done dumping result, returning 7557 1726882109.05233: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [12673a56-9f93-ed48-b3a5-0000000000bd] 7557 1726882109.05237: sending task result for task 12673a56-9f93-ed48-b3a5-0000000000bd skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 7557 1726882109.05378: no more pending results, returning what we have 7557 1726882109.05382: results queue empty 7557 1726882109.05383: checking for any_errors_fatal 7557 1726882109.05388: done checking for any_errors_fatal 7557 1726882109.05388: checking for max_fail_percentage 7557 1726882109.05390: done checking for max_fail_percentage 7557 1726882109.05391: checking to see if all hosts have failed and the running result is not ok 7557 1726882109.05395: done checking to see if all hosts have failed 7557 1726882109.05396: getting the remaining hosts for this loop 7557 1726882109.05398: done getting the remaining hosts for this loop 7557 1726882109.05401: getting the next task for host managed_node3 7557 1726882109.05408: done getting next task for host managed_node3 7557 1726882109.05413: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 7557 1726882109.05416: ^ state is: HOST STATE: block=2, task=26, 
rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7557 1726882109.05436: getting variables 7557 1726882109.05437: in VariableManager get_vars() 7557 1726882109.05481: Calling all_inventory to load vars for managed_node3 7557 1726882109.05484: Calling groups_inventory to load vars for managed_node3 7557 1726882109.05486: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882109.05504: Calling all_plugins_play to load vars for managed_node3 7557 1726882109.05507: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882109.05512: done sending task result for task 12673a56-9f93-ed48-b3a5-0000000000bd 7557 1726882109.05515: WORKER PROCESS EXITING 7557 1726882109.05518: Calling groups_plugins_play to load vars for managed_node3 7557 1726882109.06298: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882109.07153: done with get_vars() 7557 1726882109.07169: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 7557 1726882109.07224: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 21:28:29 -0400 (0:00:00.061) 0:00:34.925 ****** 7557 1726882109.07248: entering _queue_task() for managed_node3/yum 7557 1726882109.07455: worker is 1 (out of 1 available) 7557 1726882109.07470: exiting _queue_task() for managed_node3/yum 7557 1726882109.07482: done queuing things up, now waiting for results queue to drain 7557 1726882109.07484: waiting for pending results... 
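The line "redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf" above shows that on this dnf-based host the yum action is serviced by the dnf action plugin; the task is then skipped because `ansible_distribution_major_version | int < 8` is False. A sketch of a yum-based availability check guarded the same way; the package name and the use of check_mode are placeholders, since the role's actual arguments are not visible in this log:

    - name: Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces
      ansible.builtin.yum:
        name: NetworkManager    # placeholder; the role's real package list is not shown here
        state: latest
      check_mode: true
      when: ansible_distribution_major_version | int < 8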
7557 1726882109.07667: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 7557 1726882109.07757: in run() - task 12673a56-9f93-ed48-b3a5-0000000000be 7557 1726882109.07767: variable 'ansible_search_path' from source: unknown 7557 1726882109.07771: variable 'ansible_search_path' from source: unknown 7557 1726882109.07803: calling self._execute() 7557 1726882109.07878: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882109.07883: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882109.07892: variable 'omit' from source: magic vars 7557 1726882109.08155: variable 'ansible_distribution_major_version' from source: facts 7557 1726882109.08165: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882109.08286: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 7557 1726882109.09992: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 7557 1726882109.10040: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 7557 1726882109.10066: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 7557 1726882109.10090: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 7557 1726882109.10115: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 7557 1726882109.10171: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7557 1726882109.10190: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7557 1726882109.10214: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7557 1726882109.10243: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7557 1726882109.10253: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7557 1726882109.10321: variable 'ansible_distribution_major_version' from source: facts 7557 1726882109.10336: Evaluated conditional (ansible_distribution_major_version | int < 8): False 7557 1726882109.10339: when evaluation is False, skipping this task 7557 1726882109.10342: _execute() done 7557 1726882109.10344: dumping result to json 7557 1726882109.10346: done dumping result, returning 7557 1726882109.10353: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [12673a56-9f93-ed48-b3a5-0000000000be] 7557 1726882109.10358: sending task result for 
task 12673a56-9f93-ed48-b3a5-0000000000be 7557 1726882109.10445: done sending task result for task 12673a56-9f93-ed48-b3a5-0000000000be 7557 1726882109.10448: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 7557 1726882109.10500: no more pending results, returning what we have 7557 1726882109.10504: results queue empty 7557 1726882109.10505: checking for any_errors_fatal 7557 1726882109.10511: done checking for any_errors_fatal 7557 1726882109.10512: checking for max_fail_percentage 7557 1726882109.10514: done checking for max_fail_percentage 7557 1726882109.10514: checking to see if all hosts have failed and the running result is not ok 7557 1726882109.10515: done checking to see if all hosts have failed 7557 1726882109.10516: getting the remaining hosts for this loop 7557 1726882109.10517: done getting the remaining hosts for this loop 7557 1726882109.10521: getting the next task for host managed_node3 7557 1726882109.10527: done getting next task for host managed_node3 7557 1726882109.10530: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 7557 1726882109.10533: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7557 1726882109.10553: getting variables 7557 1726882109.10554: in VariableManager get_vars() 7557 1726882109.10601: Calling all_inventory to load vars for managed_node3 7557 1726882109.10603: Calling groups_inventory to load vars for managed_node3 7557 1726882109.10605: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882109.10614: Calling all_plugins_play to load vars for managed_node3 7557 1726882109.10616: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882109.10618: Calling groups_plugins_play to load vars for managed_node3 7557 1726882109.15617: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882109.17116: done with get_vars() 7557 1726882109.17147: done getting variables 7557 1726882109.17201: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 21:28:29 -0400 (0:00:00.099) 0:00:35.025 ****** 7557 1726882109.17233: entering _queue_task() for managed_node3/fail 7557 1726882109.17576: worker is 1 (out of 1 available) 7557 1726882109.17590: exiting _queue_task() for managed_node3/fail 7557 1726882109.17606: done queuing things up, now waiting for results queue to drain 7557 1726882109.17608: waiting for pending results... 7557 1726882109.18016: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 7557 1726882109.18072: in run() - task 12673a56-9f93-ed48-b3a5-0000000000bf 7557 1726882109.18097: variable 'ansible_search_path' from source: unknown 7557 1726882109.18113: variable 'ansible_search_path' from source: unknown 7557 1726882109.18159: calling self._execute() 7557 1726882109.18268: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882109.18283: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882109.18304: variable 'omit' from source: magic vars 7557 1726882109.18920: variable 'ansible_distribution_major_version' from source: facts 7557 1726882109.18940: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882109.19034: variable '__network_wireless_connections_defined' from source: role '' defaults 7557 1726882109.19167: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 7557 1726882109.20901: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 7557 1726882109.20905: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 7557 1726882109.20908: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 7557 1726882109.20911: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 7557 1726882109.20913: Loading FilterModule 'urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 7557 1726882109.20959: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7557 1726882109.20996: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7557 1726882109.21032: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7557 1726882109.21077: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7557 1726882109.21099: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7557 1726882109.21147: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7557 1726882109.21174: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7557 1726882109.21206: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7557 1726882109.21248: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7557 1726882109.21267: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7557 1726882109.21321: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7557 1726882109.21347: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7557 1726882109.21374: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7557 1726882109.21420: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7557 1726882109.21437: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7557 1726882109.21621: variable 'network_connections' from source: task vars 7557 1726882109.21638: variable 
'interface' from source: play vars 7557 1726882109.21709: variable 'interface' from source: play vars 7557 1726882109.21783: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 7557 1726882109.21955: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 7557 1726882109.22001: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 7557 1726882109.22038: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 7557 1726882109.22070: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 7557 1726882109.22121: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 7557 1726882109.22141: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 7557 1726882109.22170: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 7557 1726882109.22203: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 7557 1726882109.22264: variable '__network_team_connections_defined' from source: role '' defaults 7557 1726882109.22506: variable 'network_connections' from source: task vars 7557 1726882109.22516: variable 'interface' from source: play vars 7557 1726882109.22579: variable 'interface' from source: play vars 7557 1726882109.22620: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 7557 1726882109.22628: when evaluation is False, skipping this task 7557 1726882109.22635: _execute() done 7557 1726882109.22641: dumping result to json 7557 1726882109.22647: done dumping result, returning 7557 1726882109.22658: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [12673a56-9f93-ed48-b3a5-0000000000bf] 7557 1726882109.22669: sending task result for task 12673a56-9f93-ed48-b3a5-0000000000bf skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 7557 1726882109.22833: no more pending results, returning what we have 7557 1726882109.22837: results queue empty 7557 1726882109.22838: checking for any_errors_fatal 7557 1726882109.22848: done checking for any_errors_fatal 7557 1726882109.22848: checking for max_fail_percentage 7557 1726882109.22850: done checking for max_fail_percentage 7557 1726882109.22851: checking to see if all hosts have failed and the running result is not ok 7557 1726882109.22851: done checking to see if all hosts have failed 7557 1726882109.22852: getting the remaining hosts for this loop 7557 1726882109.22853: done getting the remaining hosts for this loop 7557 1726882109.22857: getting the next task for host managed_node3 7557 1726882109.22863: done getting next task for host managed_node3 7557 1726882109.22867: ^ 
task is: TASK: fedora.linux_system_roles.network : Install packages 7557 1726882109.22869: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7557 1726882109.22894: getting variables 7557 1726882109.22896: in VariableManager get_vars() 7557 1726882109.22945: Calling all_inventory to load vars for managed_node3 7557 1726882109.22948: Calling groups_inventory to load vars for managed_node3 7557 1726882109.22950: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882109.22960: Calling all_plugins_play to load vars for managed_node3 7557 1726882109.22963: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882109.22965: Calling groups_plugins_play to load vars for managed_node3 7557 1726882109.23510: done sending task result for task 12673a56-9f93-ed48-b3a5-0000000000bf 7557 1726882109.23513: WORKER PROCESS EXITING 7557 1726882109.24360: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882109.25237: done with get_vars() 7557 1726882109.25252: done getting variables 7557 1726882109.25299: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 21:28:29 -0400 (0:00:00.080) 0:00:35.106 ****** 7557 1726882109.25324: entering _queue_task() for managed_node3/package 7557 1726882109.25553: worker is 1 (out of 1 available) 7557 1726882109.25567: exiting _queue_task() for managed_node3/package 7557 1726882109.25580: done queuing things up, now waiting for results queue to drain 7557 1726882109.25581: waiting for pending results... 
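The skip just recorded for roles/network/tasks/main.yml:60 is driven entirely by the logged condition __network_wireless_connections_defined or __network_team_connections_defined, which evaluated to False for managed_node3. As a rough sketch of the shape of such a guarded consent/abort task (the 'fail' action and the when expression come from the log above; the message text is an illustrative assumption, not the role's actual source):

- name: Ask user's consent to restart NetworkManager due to wireless or team interfaces
  ansible.builtin.fail:
    # Placeholder message; the real task's wording is not shown in this log.
    msg: NetworkManager must be restarted to apply wireless/team changes, but consent was not given.
  when: __network_wireless_connections_defined or __network_team_connections_defined  # taken verbatim from false_condition above

Because neither wireless nor team connections are defined in this run, the guard is False and the task is skipped without touching the host.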
7557 1726882109.25760: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install packages 7557 1726882109.25861: in run() - task 12673a56-9f93-ed48-b3a5-0000000000c0 7557 1726882109.25871: variable 'ansible_search_path' from source: unknown 7557 1726882109.25875: variable 'ansible_search_path' from source: unknown 7557 1726882109.25907: calling self._execute() 7557 1726882109.25991: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882109.25999: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882109.26008: variable 'omit' from source: magic vars 7557 1726882109.26499: variable 'ansible_distribution_major_version' from source: facts 7557 1726882109.26504: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882109.26602: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 7557 1726882109.26865: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 7557 1726882109.26918: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 7557 1726882109.26996: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 7557 1726882109.27045: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 7557 1726882109.27159: variable 'network_packages' from source: role '' defaults 7557 1726882109.27268: variable '__network_provider_setup' from source: role '' defaults 7557 1726882109.27284: variable '__network_service_name_default_nm' from source: role '' defaults 7557 1726882109.27355: variable '__network_service_name_default_nm' from source: role '' defaults 7557 1726882109.27369: variable '__network_packages_default_nm' from source: role '' defaults 7557 1726882109.27434: variable '__network_packages_default_nm' from source: role '' defaults 7557 1726882109.27628: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 7557 1726882109.29865: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 7557 1726882109.29931: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 7557 1726882109.29969: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 7557 1726882109.30020: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 7557 1726882109.30050: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 7557 1726882109.30300: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7557 1726882109.30304: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7557 1726882109.30306: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7557 1726882109.30309: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7557 1726882109.30311: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7557 1726882109.30313: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7557 1726882109.30318: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7557 1726882109.30348: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7557 1726882109.30388: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7557 1726882109.30410: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7557 1726882109.30645: variable '__network_packages_default_gobject_packages' from source: role '' defaults 7557 1726882109.30774: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7557 1726882109.30806: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7557 1726882109.30834: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7557 1726882109.30874: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7557 1726882109.30891: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7557 1726882109.30998: variable 'ansible_python' from source: facts 7557 1726882109.31027: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 7557 1726882109.31112: variable '__network_wpa_supplicant_required' from source: role '' defaults 7557 1726882109.31190: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 7557 1726882109.31323: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7557 1726882109.31350: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, 
class_only=False) 7557 1726882109.31377: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7557 1726882109.31426: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7557 1726882109.31447: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7557 1726882109.31497: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7557 1726882109.31533: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7557 1726882109.31560: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7557 1726882109.31606: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7557 1726882109.31626: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7557 1726882109.31767: variable 'network_connections' from source: task vars 7557 1726882109.31899: variable 'interface' from source: play vars 7557 1726882109.31903: variable 'interface' from source: play vars 7557 1726882109.31952: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 7557 1726882109.31985: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 7557 1726882109.32025: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 7557 1726882109.32062: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 7557 1726882109.32117: variable '__network_wireless_connections_defined' from source: role '' defaults 7557 1726882109.32415: variable 'network_connections' from source: task vars 7557 1726882109.32426: variable 'interface' from source: play vars 7557 1726882109.32527: variable 'interface' from source: play vars 7557 1726882109.32579: variable '__network_packages_default_wireless' from source: role '' defaults 7557 1726882109.32659: variable '__network_wireless_connections_defined' from source: role '' defaults 7557 1726882109.33049: variable 'network_connections' from source: task vars 7557 1726882109.33052: variable 'interface' from source: play vars 7557 
1726882109.33054: variable 'interface' from source: play vars 7557 1726882109.33074: variable '__network_packages_default_team' from source: role '' defaults 7557 1726882109.33160: variable '__network_team_connections_defined' from source: role '' defaults 7557 1726882109.33443: variable 'network_connections' from source: task vars 7557 1726882109.33453: variable 'interface' from source: play vars 7557 1726882109.33524: variable 'interface' from source: play vars 7557 1726882109.33595: variable '__network_service_name_default_initscripts' from source: role '' defaults 7557 1726882109.33657: variable '__network_service_name_default_initscripts' from source: role '' defaults 7557 1726882109.33668: variable '__network_packages_default_initscripts' from source: role '' defaults 7557 1726882109.33733: variable '__network_packages_default_initscripts' from source: role '' defaults 7557 1726882109.33978: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 7557 1726882109.34461: variable 'network_connections' from source: task vars 7557 1726882109.34472: variable 'interface' from source: play vars 7557 1726882109.34536: variable 'interface' from source: play vars 7557 1726882109.34571: variable 'ansible_distribution' from source: facts 7557 1726882109.34574: variable '__network_rh_distros' from source: role '' defaults 7557 1726882109.34577: variable 'ansible_distribution_major_version' from source: facts 7557 1726882109.34603: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 7557 1726882109.34789: variable 'ansible_distribution' from source: facts 7557 1726882109.34795: variable '__network_rh_distros' from source: role '' defaults 7557 1726882109.34798: variable 'ansible_distribution_major_version' from source: facts 7557 1726882109.34804: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 7557 1726882109.34968: variable 'ansible_distribution' from source: facts 7557 1726882109.35008: variable '__network_rh_distros' from source: role '' defaults 7557 1726882109.35011: variable 'ansible_distribution_major_version' from source: facts 7557 1726882109.35036: variable 'network_provider' from source: set_fact 7557 1726882109.35057: variable 'ansible_facts' from source: unknown 7557 1726882109.35751: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 7557 1726882109.35768: when evaluation is False, skipping this task 7557 1726882109.35771: _execute() done 7557 1726882109.35797: dumping result to json 7557 1726882109.35800: done dumping result, returning 7557 1726882109.35803: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install packages [12673a56-9f93-ed48-b3a5-0000000000c0] 7557 1726882109.35807: sending task result for task 12673a56-9f93-ed48-b3a5-0000000000c0 skipping: [managed_node3] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 7557 1726882109.36066: no more pending results, returning what we have 7557 1726882109.36070: results queue empty 7557 1726882109.36071: checking for any_errors_fatal 7557 1726882109.36078: done checking for any_errors_fatal 7557 1726882109.36079: checking for max_fail_percentage 7557 1726882109.36081: done checking for max_fail_percentage 7557 1726882109.36082: checking to see if all hosts have failed and the running result is not ok 7557 1726882109.36083: done 
checking to see if all hosts have failed 7557 1726882109.36084: getting the remaining hosts for this loop 7557 1726882109.36085: done getting the remaining hosts for this loop 7557 1726882109.36089: getting the next task for host managed_node3 7557 1726882109.36098: done getting next task for host managed_node3 7557 1726882109.36106: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 7557 1726882109.36109: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7557 1726882109.36132: getting variables 7557 1726882109.36133: in VariableManager get_vars() 7557 1726882109.36184: Calling all_inventory to load vars for managed_node3 7557 1726882109.36186: Calling groups_inventory to load vars for managed_node3 7557 1726882109.36189: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882109.36330: Calling all_plugins_play to load vars for managed_node3 7557 1726882109.36334: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882109.36338: Calling groups_plugins_play to load vars for managed_node3 7557 1726882109.36943: done sending task result for task 12673a56-9f93-ed48-b3a5-0000000000c0 7557 1726882109.36947: WORKER PROCESS EXITING 7557 1726882109.38186: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882109.41239: done with get_vars() 7557 1726882109.41267: done getting variables 7557 1726882109.41324: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 21:28:29 -0400 (0:00:00.160) 0:00:35.266 ****** 7557 1726882109.41358: entering _queue_task() for managed_node3/package 7557 1726882109.42200: worker is 1 (out of 1 available) 7557 1726882109.42214: exiting _queue_task() for managed_node3/package 7557 1726882109.42228: done queuing things up, now waiting for results queue to drain 7557 1726882109.42229: waiting for pending results... 
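The Install packages task at roles/network/tasks/main.yml:73 uses the 'package' action plugin and is guarded by not network_packages is subset(ansible_facts.packages.keys()), as reported in the skip result above: every package in network_packages is already present in the gathered package facts, so there is nothing to install. A minimal sketch of a task of that shape (module arguments are assumptions for illustration; the when expression is quoted verbatim from the log):

- name: Install packages
  ansible.builtin.package:
    name: "{{ network_packages }}"  # assumed to install the role-computed package list
    state: present
  when: not network_packages is subset(ansible_facts.packages.keys())  # skip when everything is already installed

The subset test compares the wanted package names against the keys of ansible_facts.packages, which is why the preceding lines show the role resolving the __network_packages_default_* variables and the package facts before deciding to skip.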
7557 1726882109.42800: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 7557 1726882109.43062: in run() - task 12673a56-9f93-ed48-b3a5-0000000000c1 7557 1726882109.43077: variable 'ansible_search_path' from source: unknown 7557 1726882109.43081: variable 'ansible_search_path' from source: unknown 7557 1726882109.43122: calling self._execute() 7557 1726882109.43417: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882109.43423: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882109.43433: variable 'omit' from source: magic vars 7557 1726882109.44430: variable 'ansible_distribution_major_version' from source: facts 7557 1726882109.44434: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882109.44780: variable 'network_state' from source: role '' defaults 7557 1726882109.44784: Evaluated conditional (network_state != {}): False 7557 1726882109.44786: when evaluation is False, skipping this task 7557 1726882109.44789: _execute() done 7557 1726882109.44796: dumping result to json 7557 1726882109.44799: done dumping result, returning 7557 1726882109.44820: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [12673a56-9f93-ed48-b3a5-0000000000c1] 7557 1726882109.44832: sending task result for task 12673a56-9f93-ed48-b3a5-0000000000c1 skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 7557 1726882109.45019: no more pending results, returning what we have 7557 1726882109.45023: results queue empty 7557 1726882109.45024: checking for any_errors_fatal 7557 1726882109.45033: done checking for any_errors_fatal 7557 1726882109.45034: checking for max_fail_percentage 7557 1726882109.45036: done checking for max_fail_percentage 7557 1726882109.45037: checking to see if all hosts have failed and the running result is not ok 7557 1726882109.45038: done checking to see if all hosts have failed 7557 1726882109.45039: getting the remaining hosts for this loop 7557 1726882109.45040: done getting the remaining hosts for this loop 7557 1726882109.45043: getting the next task for host managed_node3 7557 1726882109.45050: done getting next task for host managed_node3 7557 1726882109.45053: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 7557 1726882109.45057: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7557 1726882109.45085: getting variables 7557 1726882109.45087: in VariableManager get_vars() 7557 1726882109.45143: Calling all_inventory to load vars for managed_node3 7557 1726882109.45146: Calling groups_inventory to load vars for managed_node3 7557 1726882109.45148: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882109.45159: Calling all_plugins_play to load vars for managed_node3 7557 1726882109.45162: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882109.45165: Calling groups_plugins_play to load vars for managed_node3 7557 1726882109.46522: done sending task result for task 12673a56-9f93-ed48-b3a5-0000000000c1 7557 1726882109.47100: WORKER PROCESS EXITING 7557 1726882109.47981: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882109.51836: done with get_vars() 7557 1726882109.51861: done getting variables 7557 1726882109.52046: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 21:28:29 -0400 (0:00:00.107) 0:00:35.373 ****** 7557 1726882109.52083: entering _queue_task() for managed_node3/package 7557 1726882109.53389: worker is 1 (out of 1 available) 7557 1726882109.53405: exiting _queue_task() for managed_node3/package 7557 1726882109.53418: done queuing things up, now waiting for results queue to drain 7557 1726882109.53420: waiting for pending results... 
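Both network_state-guarded install tasks share one guard: the task at main.yml:85 was just skipped with network_state != {} evaluating to False, and the python3-libnmstate task at main.yml:96 queued here carries the same condition (its skip appears below). Since network_state keeps its empty role default, neither install runs. A sketch of that guard shape, assuming an illustrative package list:

- name: Install NetworkManager and nmstate when using network_state variable
  ansible.builtin.package:
    name:
      - NetworkManager  # package names here are an assumption for illustration
      - nmstate
    state: present
  when: network_state != {}  # condition quoted verbatim from the skip results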
7557 1726882109.54213: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 7557 1726882109.54276: in run() - task 12673a56-9f93-ed48-b3a5-0000000000c2 7557 1726882109.54304: variable 'ansible_search_path' from source: unknown 7557 1726882109.54312: variable 'ansible_search_path' from source: unknown 7557 1726882109.54357: calling self._execute() 7557 1726882109.54652: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882109.54664: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882109.54739: variable 'omit' from source: magic vars 7557 1726882109.55549: variable 'ansible_distribution_major_version' from source: facts 7557 1726882109.55566: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882109.55689: variable 'network_state' from source: role '' defaults 7557 1726882109.55839: Evaluated conditional (network_state != {}): False 7557 1726882109.55848: when evaluation is False, skipping this task 7557 1726882109.55855: _execute() done 7557 1726882109.55861: dumping result to json 7557 1726882109.55868: done dumping result, returning 7557 1726882109.55878: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [12673a56-9f93-ed48-b3a5-0000000000c2] 7557 1726882109.55905: sending task result for task 12673a56-9f93-ed48-b3a5-0000000000c2 skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 7557 1726882109.56110: no more pending results, returning what we have 7557 1726882109.56115: results queue empty 7557 1726882109.56117: checking for any_errors_fatal 7557 1726882109.56126: done checking for any_errors_fatal 7557 1726882109.56127: checking for max_fail_percentage 7557 1726882109.56129: done checking for max_fail_percentage 7557 1726882109.56130: checking to see if all hosts have failed and the running result is not ok 7557 1726882109.56130: done checking to see if all hosts have failed 7557 1726882109.56131: getting the remaining hosts for this loop 7557 1726882109.56133: done getting the remaining hosts for this loop 7557 1726882109.56136: getting the next task for host managed_node3 7557 1726882109.56143: done getting next task for host managed_node3 7557 1726882109.56147: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 7557 1726882109.56150: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7557 1726882109.56405: getting variables 7557 1726882109.56407: in VariableManager get_vars() 7557 1726882109.56459: Calling all_inventory to load vars for managed_node3 7557 1726882109.56462: Calling groups_inventory to load vars for managed_node3 7557 1726882109.56464: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882109.56479: Calling all_plugins_play to load vars for managed_node3 7557 1726882109.56482: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882109.56484: Calling groups_plugins_play to load vars for managed_node3 7557 1726882109.57400: done sending task result for task 12673a56-9f93-ed48-b3a5-0000000000c2 7557 1726882109.57404: WORKER PROCESS EXITING 7557 1726882109.59170: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882109.62359: done with get_vars() 7557 1726882109.62390: done getting variables 7557 1726882109.62655: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 21:28:29 -0400 (0:00:00.106) 0:00:35.479 ****** 7557 1726882109.62696: entering _queue_task() for managed_node3/service 7557 1726882109.63438: worker is 1 (out of 1 available) 7557 1726882109.63451: exiting _queue_task() for managed_node3/service 7557 1726882109.63464: done queuing things up, now waiting for results queue to drain 7557 1726882109.63466: waiting for pending results... 
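The Restart NetworkManager task queued here (main.yml:109, 'service' action plugin) is guarded by the same wireless/team condition as the consent task earlier, presumably so NetworkManager is only bounced when newly installed wireless or team support has to be picked up; as the lines below show, it is skipped in this run for the same reason. A hedged sketch of such a guarded restart (service name and arguments are assumptions; the when expression is from the log):

- name: Restart NetworkManager due to wireless or team interfaces
  ansible.builtin.service:
    name: NetworkManager  # assumed service name; the role resolves it through its own variables
    state: restarted
  when: __network_wireless_connections_defined or __network_team_connections_defined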
7557 1726882109.63840: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 7557 1726882109.64058: in run() - task 12673a56-9f93-ed48-b3a5-0000000000c3 7557 1726882109.64164: variable 'ansible_search_path' from source: unknown 7557 1726882109.64173: variable 'ansible_search_path' from source: unknown 7557 1726882109.64216: calling self._execute() 7557 1726882109.64439: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882109.64452: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882109.64465: variable 'omit' from source: magic vars 7557 1726882109.65253: variable 'ansible_distribution_major_version' from source: facts 7557 1726882109.65315: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882109.65532: variable '__network_wireless_connections_defined' from source: role '' defaults 7557 1726882109.66026: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 7557 1726882109.71301: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 7557 1726882109.71402: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 7557 1726882109.71445: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 7557 1726882109.71485: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 7557 1726882109.71519: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 7557 1726882109.71675: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7557 1726882109.72099: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7557 1726882109.72102: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7557 1726882109.72104: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7557 1726882109.72106: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7557 1726882109.72108: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7557 1726882109.72110: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7557 1726882109.72111: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, 
class_only=False) 7557 1726882109.72499: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7557 1726882109.72502: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7557 1726882109.72504: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7557 1726882109.72506: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7557 1726882109.72508: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7557 1726882109.72509: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7557 1726882109.72511: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7557 1726882109.72854: variable 'network_connections' from source: task vars 7557 1726882109.73298: variable 'interface' from source: play vars 7557 1726882109.73301: variable 'interface' from source: play vars 7557 1726882109.73304: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 7557 1726882109.73437: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 7557 1726882109.74407: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 7557 1726882109.74444: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 7557 1726882109.74477: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 7557 1726882109.74744: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 7557 1726882109.74764: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 7557 1726882109.74797: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 7557 1726882109.74830: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 7557 1726882109.74900: variable '__network_team_connections_defined' from source: role '' defaults 7557 1726882109.75353: variable 'network_connections' from source: task vars 7557 1726882109.75699: variable 'interface' from source: play vars 7557 1726882109.75702: variable 
'interface' from source: play vars 7557 1726882109.75714: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 7557 1726882109.75721: when evaluation is False, skipping this task 7557 1726882109.75726: _execute() done 7557 1726882109.75731: dumping result to json 7557 1726882109.75736: done dumping result, returning 7557 1726882109.75746: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [12673a56-9f93-ed48-b3a5-0000000000c3] 7557 1726882109.75753: sending task result for task 12673a56-9f93-ed48-b3a5-0000000000c3 7557 1726882109.75858: done sending task result for task 12673a56-9f93-ed48-b3a5-0000000000c3 7557 1726882109.75874: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 7557 1726882109.75934: no more pending results, returning what we have 7557 1726882109.75937: results queue empty 7557 1726882109.75938: checking for any_errors_fatal 7557 1726882109.75944: done checking for any_errors_fatal 7557 1726882109.75945: checking for max_fail_percentage 7557 1726882109.75947: done checking for max_fail_percentage 7557 1726882109.75947: checking to see if all hosts have failed and the running result is not ok 7557 1726882109.75948: done checking to see if all hosts have failed 7557 1726882109.75949: getting the remaining hosts for this loop 7557 1726882109.75950: done getting the remaining hosts for this loop 7557 1726882109.75953: getting the next task for host managed_node3 7557 1726882109.75960: done getting next task for host managed_node3 7557 1726882109.75963: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 7557 1726882109.75966: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7557 1726882109.75985: getting variables 7557 1726882109.75986: in VariableManager get_vars() 7557 1726882109.76243: Calling all_inventory to load vars for managed_node3 7557 1726882109.76247: Calling groups_inventory to load vars for managed_node3 7557 1726882109.76249: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882109.76258: Calling all_plugins_play to load vars for managed_node3 7557 1726882109.76261: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882109.76263: Calling groups_plugins_play to load vars for managed_node3 7557 1726882109.79606: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882109.82970: done with get_vars() 7557 1726882109.83113: done getting variables 7557 1726882109.83170: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 21:28:29 -0400 (0:00:00.205) 0:00:35.684 ****** 7557 1726882109.83318: entering _queue_task() for managed_node3/service 7557 1726882109.84101: worker is 1 (out of 1 available) 7557 1726882109.84113: exiting _queue_task() for managed_node3/service 7557 1726882109.84126: done queuing things up, now waiting for results queue to drain 7557 1726882109.84127: waiting for pending results... 7557 1726882109.84488: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 7557 1726882109.84916: in run() - task 12673a56-9f93-ed48-b3a5-0000000000c4 7557 1726882109.84938: variable 'ansible_search_path' from source: unknown 7557 1726882109.84947: variable 'ansible_search_path' from source: unknown 7557 1726882109.85026: calling self._execute() 7557 1726882109.85310: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882109.85324: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882109.85339: variable 'omit' from source: magic vars 7557 1726882109.86400: variable 'ansible_distribution_major_version' from source: facts 7557 1726882109.86403: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882109.86406: variable 'network_provider' from source: set_fact 7557 1726882109.86702: variable 'network_state' from source: role '' defaults 7557 1726882109.86706: Evaluated conditional (network_provider == "nm" or network_state != {}): True 7557 1726882109.86711: variable 'omit' from source: magic vars 7557 1726882109.86714: variable 'omit' from source: magic vars 7557 1726882109.86718: variable 'network_service_name' from source: role '' defaults 7557 1726882109.86786: variable 'network_service_name' from source: role '' defaults 7557 1726882109.87107: variable '__network_provider_setup' from source: role '' defaults 7557 1726882109.87118: variable '__network_service_name_default_nm' from source: role '' defaults 7557 1726882109.87182: variable '__network_service_name_default_nm' from source: role '' defaults 7557 1726882109.87599: variable '__network_packages_default_nm' from source: role '' defaults 7557 1726882109.87603: variable 
'__network_packages_default_nm' from source: role '' defaults 7557 1726882109.87769: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 7557 1726882109.92460: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 7557 1726882109.92987: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 7557 1726882109.93601: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 7557 1726882109.93606: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 7557 1726882109.93608: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 7557 1726882109.93611: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7557 1726882109.93859: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7557 1726882109.93892: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7557 1726882109.94042: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7557 1726882109.94064: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7557 1726882109.94123: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7557 1726882109.94308: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7557 1726882109.94338: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7557 1726882109.94381: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7557 1726882109.94510: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7557 1726882109.94946: variable '__network_packages_default_gobject_packages' from source: role '' defaults 7557 1726882109.95100: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7557 1726882109.95276: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7557 1726882109.95309: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7557 1726882109.95354: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7557 1726882109.95377: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7557 1726882109.95565: variable 'ansible_python' from source: facts 7557 1726882109.95800: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 7557 1726882109.96017: variable '__network_wpa_supplicant_required' from source: role '' defaults 7557 1726882109.96021: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 7557 1726882109.96354: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7557 1726882109.96385: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7557 1726882109.96420: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7557 1726882109.96468: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7557 1726882109.96516: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7557 1726882109.96682: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7557 1726882109.96723: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7557 1726882109.96752: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7557 1726882109.96821: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7557 1726882109.96909: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7557 1726882109.97162: variable 'network_connections' from source: task vars 7557 1726882109.97228: variable 'interface' from source: play vars 7557 
1726882109.97348: variable 'interface' from source: play vars 7557 1726882109.97711: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 7557 1726882109.98213: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 7557 1726882109.98372: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 7557 1726882109.98517: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 7557 1726882109.98553: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 7557 1726882109.98664: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 7557 1726882109.98770: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 7557 1726882109.99066: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 7557 1726882109.99069: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 7557 1726882109.99072: variable '__network_wireless_connections_defined' from source: role '' defaults 7557 1726882109.99675: variable 'network_connections' from source: task vars 7557 1726882109.99728: variable 'interface' from source: play vars 7557 1726882109.99901: variable 'interface' from source: play vars 7557 1726882109.99972: variable '__network_packages_default_wireless' from source: role '' defaults 7557 1726882110.00127: variable '__network_wireless_connections_defined' from source: role '' defaults 7557 1726882110.00832: variable 'network_connections' from source: task vars 7557 1726882110.00842: variable 'interface' from source: play vars 7557 1726882110.01026: variable 'interface' from source: play vars 7557 1726882110.01038: variable '__network_packages_default_team' from source: role '' defaults 7557 1726882110.01160: variable '__network_team_connections_defined' from source: role '' defaults 7557 1726882110.01763: variable 'network_connections' from source: task vars 7557 1726882110.02003: variable 'interface' from source: play vars 7557 1726882110.02006: variable 'interface' from source: play vars 7557 1726882110.02145: variable '__network_service_name_default_initscripts' from source: role '' defaults 7557 1726882110.02243: variable '__network_service_name_default_initscripts' from source: role '' defaults 7557 1726882110.02447: variable '__network_packages_default_initscripts' from source: role '' defaults 7557 1726882110.02450: variable '__network_packages_default_initscripts' from source: role '' defaults 7557 1726882110.02901: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 7557 1726882110.03961: variable 'network_connections' from source: task vars 7557 1726882110.03977: variable 'interface' from source: play vars 7557 1726882110.04141: variable 'interface' from source: play vars 7557 1726882110.04158: variable 'ansible_distribution' from source: facts 7557 1726882110.04166: variable '__network_rh_distros' from 
source: role '' defaults 7557 1726882110.04176: variable 'ansible_distribution_major_version' from source: facts 7557 1726882110.04212: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 7557 1726882110.04600: variable 'ansible_distribution' from source: facts 7557 1726882110.04603: variable '__network_rh_distros' from source: role '' defaults 7557 1726882110.04605: variable 'ansible_distribution_major_version' from source: facts 7557 1726882110.04607: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 7557 1726882110.05063: variable 'ansible_distribution' from source: facts 7557 1726882110.05073: variable '__network_rh_distros' from source: role '' defaults 7557 1726882110.05083: variable 'ansible_distribution_major_version' from source: facts 7557 1726882110.05136: variable 'network_provider' from source: set_fact 7557 1726882110.05231: variable 'omit' from source: magic vars 7557 1726882110.05265: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7557 1726882110.05363: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7557 1726882110.05384: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7557 1726882110.05408: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7557 1726882110.05456: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7557 1726882110.05487: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7557 1726882110.05559: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882110.05568: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882110.05878: Set connection var ansible_module_compression to ZIP_DEFLATED 7557 1726882110.05881: Set connection var ansible_shell_executable to /bin/sh 7557 1726882110.05883: Set connection var ansible_shell_type to sh 7557 1726882110.05885: Set connection var ansible_pipelining to False 7557 1726882110.05887: Set connection var ansible_connection to ssh 7557 1726882110.05888: Set connection var ansible_timeout to 10 7557 1726882110.05890: variable 'ansible_shell_executable' from source: unknown 7557 1726882110.05892: variable 'ansible_connection' from source: unknown 7557 1726882110.05897: variable 'ansible_module_compression' from source: unknown 7557 1726882110.05899: variable 'ansible_shell_type' from source: unknown 7557 1726882110.05901: variable 'ansible_shell_executable' from source: unknown 7557 1726882110.05902: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882110.05904: variable 'ansible_pipelining' from source: unknown 7557 1726882110.05905: variable 'ansible_timeout' from source: unknown 7557 1726882110.05907: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882110.06101: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 7557 1726882110.06317: variable 'omit' from source: magic vars 7557 1726882110.06325: starting attempt loop 7557 
1726882110.06328: running the handler 7557 1726882110.06330: variable 'ansible_facts' from source: unknown 7557 1726882110.07941: _low_level_execute_command(): starting 7557 1726882110.08010: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7557 1726882110.09474: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7557 1726882110.09580: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882110.09700: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882110.09746: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882110.10021: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882110.11690: stdout chunk (state=3): >>>/root <<< 7557 1726882110.11828: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882110.11843: stdout chunk (state=3): >>><<< 7557 1726882110.11953: stderr chunk (state=3): >>><<< 7557 1726882110.11956: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7557 1726882110.11959: _low_level_execute_command(): starting 7557 1726882110.12129: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882110.119313-8967-50170773341806 `" && echo ansible-tmp-1726882110.119313-8967-50170773341806="` echo 
/root/.ansible/tmp/ansible-tmp-1726882110.119313-8967-50170773341806 `" ) && sleep 0' 7557 1726882110.12780: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7557 1726882110.12799: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7557 1726882110.12815: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882110.12836: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7557 1726882110.12861: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 <<< 7557 1726882110.12874: stderr chunk (state=3): >>>debug2: match not found <<< 7557 1726882110.12888: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882110.12912: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7557 1726882110.12925: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.10.229 is address <<< 7557 1726882110.12937: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7557 1726882110.12971: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882110.13043: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882110.13068: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882110.13309: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882110.13373: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882110.15253: stdout chunk (state=3): >>>ansible-tmp-1726882110.119313-8967-50170773341806=/root/.ansible/tmp/ansible-tmp-1726882110.119313-8967-50170773341806 <<< 7557 1726882110.15389: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882110.15405: stdout chunk (state=3): >>><<< 7557 1726882110.15417: stderr chunk (state=3): >>><<< 7557 1726882110.15480: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882110.119313-8967-50170773341806=/root/.ansible/tmp/ansible-tmp-1726882110.119313-8967-50170773341806 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master 
at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7557 1726882110.15680: variable 'ansible_module_compression' from source: unknown 7557 1726882110.15690: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-7557ap94rh2e/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 7557 1726882110.15765: variable 'ansible_facts' from source: unknown 7557 1726882110.16265: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882110.119313-8967-50170773341806/AnsiballZ_systemd.py 7557 1726882110.16523: Sending initial data 7557 1726882110.16586: Sent initial data (152 bytes) 7557 1726882110.18112: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882110.18168: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882110.18206: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882110.18303: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882110.19791: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 7557 1726882110.19810: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 7557 1726882110.20060: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 7557 1726882110.20229: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-7557ap94rh2e/tmpc_ezzgk5 /root/.ansible/tmp/ansible-tmp-1726882110.119313-8967-50170773341806/AnsiballZ_systemd.py <<< 7557 1726882110.20232: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882110.119313-8967-50170773341806/AnsiballZ_systemd.py" <<< 7557 1726882110.20276: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-7557ap94rh2e/tmpc_ezzgk5" to remote "/root/.ansible/tmp/ansible-tmp-1726882110.119313-8967-50170773341806/AnsiballZ_systemd.py" <<< 7557 1726882110.20279: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882110.119313-8967-50170773341806/AnsiballZ_systemd.py" <<< 7557 1726882110.24316: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882110.24320: stdout chunk (state=3): >>><<< 7557 1726882110.24322: stderr chunk (state=3): >>><<< 7557 1726882110.24323: done transferring module to remote 7557 1726882110.24325: _low_level_execute_command(): starting 7557 1726882110.24328: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882110.119313-8967-50170773341806/ /root/.ansible/tmp/ansible-tmp-1726882110.119313-8967-50170773341806/AnsiballZ_systemd.py && sleep 0' 7557 1726882110.24937: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882110.24942: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882110.25239: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882110.26843: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882110.26983: stderr chunk (state=3): >>><<< 7557 1726882110.26987: stdout chunk (state=3): >>><<< 7557 1726882110.27071: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7557 1726882110.27075: _low_level_execute_command(): starting 7557 1726882110.27078: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882110.119313-8967-50170773341806/AnsiballZ_systemd.py && sleep 0' 7557 1726882110.28348: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7557 1726882110.28602: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882110.28715: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882110.28881: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882110.57519: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "711", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:22:06 EDT", "ExecMainStartTimestampMonotonic": "33869352", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 21:22:06 EDT", 
"ExecMainHandoffTimestampMonotonic": "33887880", "ExecMainPID": "711", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[Fri 2024-09-20 21:22:06 EDT] ; stop_time=[n/a] ; pid=711 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[Fri 2024-09-20 21:22:06 EDT] ; stop_time=[n/a] ; pid=711 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2938", "MemoryCurrent": "9535488", "MemoryPeak": "10055680", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3305508864", "EffectiveMemoryMax": "3702878208", "EffectiveMemoryHigh": "3702878208", "CPUUsageNSec": "178271000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", <<< 7557 1726882110.57564: stdout chunk (state=3): >>>"MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": 
"infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket sysinit.target system.slice", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "cloud-init.service shutdown.target NetworkManager-wait-online.service network.target multi-user.target", "After": "system.slice systemd-journald.socket 
dbus.socket sysinit.target network-pre.target cloud-init-local.service basic.target dbus-broker.service", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:22:07 EDT", "StateChangeTimestampMonotonic": "34618487", "InactiveExitTimestamp": "Fri 2024-09-20 21:22:06 EDT", "InactiveExitTimestampMonotonic": "33869684", "ActiveEnterTimestamp": "Fri 2024-09-20 21:22:07 EDT", "ActiveEnterTimestampMonotonic": "34618487", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:22:06 EDT", "ConditionTimestampMonotonic": "33868497", "AssertTimestamp": "Fri 2024-09-20 21:22:06 EDT", "AssertTimestampMonotonic": "33868500", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "4af90bef47894571b134cea51cfa03e2", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 7557 1726882110.59438: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. 
<<< 7557 1726882110.59448: stdout chunk (state=3): >>><<< 7557 1726882110.59460: stderr chunk (state=3): >>><<< 7557 1726882110.59483: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "711", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:22:06 EDT", "ExecMainStartTimestampMonotonic": "33869352", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 21:22:06 EDT", "ExecMainHandoffTimestampMonotonic": "33887880", "ExecMainPID": "711", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[Fri 2024-09-20 21:22:06 EDT] ; stop_time=[n/a] ; pid=711 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[Fri 2024-09-20 21:22:06 EDT] ; stop_time=[n/a] ; pid=711 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2938", "MemoryCurrent": "9535488", "MemoryPeak": "10055680", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3305508864", "EffectiveMemoryMax": "3702878208", "EffectiveMemoryHigh": "3702878208", "CPUUsageNSec": "178271000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", 
"MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket sysinit.target system.slice", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "cloud-init.service shutdown.target NetworkManager-wait-online.service network.target multi-user.target", "After": "system.slice systemd-journald.socket dbus.socket sysinit.target network-pre.target cloud-init-local.service basic.target dbus-broker.service", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:22:07 EDT", "StateChangeTimestampMonotonic": "34618487", "InactiveExitTimestamp": "Fri 2024-09-20 21:22:06 EDT", "InactiveExitTimestampMonotonic": "33869684", "ActiveEnterTimestamp": "Fri 2024-09-20 21:22:07 EDT", "ActiveEnterTimestampMonotonic": "34618487", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:22:06 EDT", "ConditionTimestampMonotonic": "33868497", "AssertTimestamp": "Fri 2024-09-20 21:22:06 EDT", "AssertTimestampMonotonic": "33868500", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "4af90bef47894571b134cea51cfa03e2", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration 
data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. 7557 1726882110.59691: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882110.119313-8967-50170773341806/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7557 1726882110.59721: _low_level_execute_command(): starting 7557 1726882110.59730: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882110.119313-8967-50170773341806/ > /dev/null 2>&1 && sleep 0' 7557 1726882110.60346: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7557 1726882110.60360: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7557 1726882110.60374: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882110.60389: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7557 1726882110.60411: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 <<< 7557 1726882110.60515: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882110.60537: stderr chunk (state=3): 
>>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882110.60615: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882110.62602: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882110.62606: stdout chunk (state=3): >>><<< 7557 1726882110.62608: stderr chunk (state=3): >>><<< 7557 1726882110.62611: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7557 1726882110.62613: handler run complete 7557 1726882110.62615: attempt loop complete, returning result 7557 1726882110.62617: _execute() done 7557 1726882110.62618: dumping result to json 7557 1726882110.62620: done dumping result, returning 7557 1726882110.62622: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [12673a56-9f93-ed48-b3a5-0000000000c4] 7557 1726882110.62624: sending task result for task 12673a56-9f93-ed48-b3a5-0000000000c4 7557 1726882110.62883: done sending task result for task 12673a56-9f93-ed48-b3a5-0000000000c4 7557 1726882110.62887: WORKER PROCESS EXITING ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 7557 1726882110.62952: no more pending results, returning what we have 7557 1726882110.62956: results queue empty 7557 1726882110.62957: checking for any_errors_fatal 7557 1726882110.62964: done checking for any_errors_fatal 7557 1726882110.62965: checking for max_fail_percentage 7557 1726882110.62966: done checking for max_fail_percentage 7557 1726882110.62968: checking to see if all hosts have failed and the running result is not ok 7557 1726882110.62969: done checking to see if all hosts have failed 7557 1726882110.62969: getting the remaining hosts for this loop 7557 1726882110.62971: done getting the remaining hosts for this loop 7557 1726882110.62975: getting the next task for host managed_node3 7557 1726882110.62982: done getting next task for host managed_node3 7557 1726882110.62986: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 7557 1726882110.62989: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7557 1726882110.63004: getting variables 7557 1726882110.63006: in VariableManager get_vars() 7557 1726882110.63055: Calling all_inventory to load vars for managed_node3 7557 1726882110.63059: Calling groups_inventory to load vars for managed_node3 7557 1726882110.63061: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882110.63072: Calling all_plugins_play to load vars for managed_node3 7557 1726882110.63076: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882110.63079: Calling groups_plugins_play to load vars for managed_node3 7557 1726882110.64703: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882110.66358: done with get_vars() 7557 1726882110.66390: done getting variables 7557 1726882110.66465: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 21:28:30 -0400 (0:00:00.832) 0:00:36.517 ****** 7557 1726882110.66505: entering _queue_task() for managed_node3/service 7557 1726882110.66874: worker is 1 (out of 1 available) 7557 1726882110.66887: exiting _queue_task() for managed_node3/service 7557 1726882110.67018: done queuing things up, now waiting for results queue to drain 7557 1726882110.67019: waiting for pending results... 
7557 1726882110.67251: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 7557 1726882110.67402: in run() - task 12673a56-9f93-ed48-b3a5-0000000000c5 7557 1726882110.67407: variable 'ansible_search_path' from source: unknown 7557 1726882110.67410: variable 'ansible_search_path' from source: unknown 7557 1726882110.67413: calling self._execute() 7557 1726882110.67525: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882110.67531: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882110.67540: variable 'omit' from source: magic vars 7557 1726882110.67947: variable 'ansible_distribution_major_version' from source: facts 7557 1726882110.67991: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882110.68077: variable 'network_provider' from source: set_fact 7557 1726882110.68089: Evaluated conditional (network_provider == "nm"): True 7557 1726882110.68181: variable '__network_wpa_supplicant_required' from source: role '' defaults 7557 1726882110.68300: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 7557 1726882110.68450: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 7557 1726882110.70902: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 7557 1726882110.71080: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 7557 1726882110.71086: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 7557 1726882110.71090: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 7557 1726882110.71098: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 7557 1726882110.71155: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7557 1726882110.71183: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7557 1726882110.71214: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7557 1726882110.71261: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7557 1726882110.71276: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7557 1726882110.71328: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7557 1726882110.71352: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, 
class_only=False) 7557 1726882110.71381: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7557 1726882110.71431: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7557 1726882110.71439: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7557 1726882110.71499: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7557 1726882110.71508: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7557 1726882110.71540: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7557 1726882110.71649: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7557 1726882110.71652: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7557 1726882110.71741: variable 'network_connections' from source: task vars 7557 1726882110.71756: variable 'interface' from source: play vars 7557 1726882110.71825: variable 'interface' from source: play vars 7557 1726882110.71926: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 7557 1726882110.72082: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 7557 1726882110.72121: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 7557 1726882110.72158: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 7557 1726882110.72201: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 7557 1726882110.72310: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 7557 1726882110.72314: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 7557 1726882110.72317: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 7557 1726882110.72320: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 7557 1726882110.72349: variable 
'__network_wireless_connections_defined' from source: role '' defaults 7557 1726882110.72596: variable 'network_connections' from source: task vars 7557 1726882110.72600: variable 'interface' from source: play vars 7557 1726882110.72666: variable 'interface' from source: play vars 7557 1726882110.72748: Evaluated conditional (__network_wpa_supplicant_required): False 7557 1726882110.72751: when evaluation is False, skipping this task 7557 1726882110.72754: _execute() done 7557 1726882110.72757: dumping result to json 7557 1726882110.72760: done dumping result, returning 7557 1726882110.72762: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [12673a56-9f93-ed48-b3a5-0000000000c5] 7557 1726882110.72774: sending task result for task 12673a56-9f93-ed48-b3a5-0000000000c5 7557 1726882110.72844: done sending task result for task 12673a56-9f93-ed48-b3a5-0000000000c5 7557 1726882110.72848: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 7557 1726882110.72926: no more pending results, returning what we have 7557 1726882110.72930: results queue empty 7557 1726882110.72931: checking for any_errors_fatal 7557 1726882110.72953: done checking for any_errors_fatal 7557 1726882110.72954: checking for max_fail_percentage 7557 1726882110.72956: done checking for max_fail_percentage 7557 1726882110.72957: checking to see if all hosts have failed and the running result is not ok 7557 1726882110.72958: done checking to see if all hosts have failed 7557 1726882110.72959: getting the remaining hosts for this loop 7557 1726882110.72964: done getting the remaining hosts for this loop 7557 1726882110.72969: getting the next task for host managed_node3 7557 1726882110.72975: done getting next task for host managed_node3 7557 1726882110.72980: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 7557 1726882110.72983: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7557 1726882110.73006: getting variables 7557 1726882110.73008: in VariableManager get_vars() 7557 1726882110.73062: Calling all_inventory to load vars for managed_node3 7557 1726882110.73065: Calling groups_inventory to load vars for managed_node3 7557 1726882110.73067: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882110.73306: Calling all_plugins_play to load vars for managed_node3 7557 1726882110.73310: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882110.73314: Calling groups_plugins_play to load vars for managed_node3 7557 1726882110.74784: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882110.75712: done with get_vars() 7557 1726882110.75728: done getting variables 7557 1726882110.75769: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 21:28:30 -0400 (0:00:00.092) 0:00:36.610 ****** 7557 1726882110.75797: entering _queue_task() for managed_node3/service 7557 1726882110.76032: worker is 1 (out of 1 available) 7557 1726882110.76046: exiting _queue_task() for managed_node3/service 7557 1726882110.76060: done queuing things up, now waiting for results queue to drain 7557 1726882110.76061: waiting for pending results... 7557 1726882110.76245: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable network service 7557 1726882110.76342: in run() - task 12673a56-9f93-ed48-b3a5-0000000000c6 7557 1726882110.76353: variable 'ansible_search_path' from source: unknown 7557 1726882110.76357: variable 'ansible_search_path' from source: unknown 7557 1726882110.76385: calling self._execute() 7557 1726882110.76466: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882110.76469: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882110.76479: variable 'omit' from source: magic vars 7557 1726882110.76881: variable 'ansible_distribution_major_version' from source: facts 7557 1726882110.76884: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882110.77099: variable 'network_provider' from source: set_fact 7557 1726882110.77103: Evaluated conditional (network_provider == "initscripts"): False 7557 1726882110.77105: when evaluation is False, skipping this task 7557 1726882110.77107: _execute() done 7557 1726882110.77108: dumping result to json 7557 1726882110.77110: done dumping result, returning 7557 1726882110.77112: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable network service [12673a56-9f93-ed48-b3a5-0000000000c6] 7557 1726882110.77114: sending task result for task 12673a56-9f93-ed48-b3a5-0000000000c6 7557 1726882110.77172: done sending task result for task 12673a56-9f93-ed48-b3a5-0000000000c6 7557 1726882110.77174: WORKER PROCESS EXITING skipping: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 7557 1726882110.77217: no more pending results, 
returning what we have 7557 1726882110.77220: results queue empty 7557 1726882110.77221: checking for any_errors_fatal 7557 1726882110.77226: done checking for any_errors_fatal 7557 1726882110.77227: checking for max_fail_percentage 7557 1726882110.77229: done checking for max_fail_percentage 7557 1726882110.77229: checking to see if all hosts have failed and the running result is not ok 7557 1726882110.77230: done checking to see if all hosts have failed 7557 1726882110.77231: getting the remaining hosts for this loop 7557 1726882110.77232: done getting the remaining hosts for this loop 7557 1726882110.77235: getting the next task for host managed_node3 7557 1726882110.77240: done getting next task for host managed_node3 7557 1726882110.77243: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 7557 1726882110.77246: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7557 1726882110.77262: getting variables 7557 1726882110.77264: in VariableManager get_vars() 7557 1726882110.77305: Calling all_inventory to load vars for managed_node3 7557 1726882110.77308: Calling groups_inventory to load vars for managed_node3 7557 1726882110.77310: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882110.77318: Calling all_plugins_play to load vars for managed_node3 7557 1726882110.77321: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882110.77323: Calling groups_plugins_play to load vars for managed_node3 7557 1726882110.78318: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882110.79275: done with get_vars() 7557 1726882110.79289: done getting variables 7557 1726882110.79332: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 21:28:30 -0400 (0:00:00.035) 0:00:36.646 ****** 7557 1726882110.79356: entering _queue_task() for managed_node3/copy 7557 1726882110.79561: worker is 1 (out of 1 available) 7557 1726882110.79572: exiting _queue_task() for managed_node3/copy 7557 1726882110.79584: done queuing things up, now waiting for results queue to drain 7557 1726882110.79586: waiting for pending results... 
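For context on the two skips above ("Enable and start wpa_supplicant" and "Enable network service"): each task carries a when: list, the executor evaluates it against facts and role variables, and a False result produces the "Conditional result was False" skip without ever contacting the host. A minimal, illustrative sketch of such a provider-gated service task with no_log enabled follows; the service name is a hypothetical placeholder and this is not the role's actual tasks/main.yml, only the conditionals and the censored no_log result visible in the log are mirrored here:

    - name: Enable network service                   # wording taken from the task banner above
      ansible.builtin.service:
        name: network                                # hypothetical service name, for illustration only
        enabled: true
      when:
        - ansible_distribution_major_version != '6'  # both conditionals appear in the log
        - network_provider == "initscripts"
      no_log: true                                   # matches the "censored" skip result above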
7557 1726882110.79771: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 7557 1726882110.80100: in run() - task 12673a56-9f93-ed48-b3a5-0000000000c7 7557 1726882110.80103: variable 'ansible_search_path' from source: unknown 7557 1726882110.80106: variable 'ansible_search_path' from source: unknown 7557 1726882110.80109: calling self._execute() 7557 1726882110.80112: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882110.80114: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882110.80117: variable 'omit' from source: magic vars 7557 1726882110.80485: variable 'ansible_distribution_major_version' from source: facts 7557 1726882110.80521: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882110.80626: variable 'network_provider' from source: set_fact 7557 1726882110.80633: Evaluated conditional (network_provider == "initscripts"): False 7557 1726882110.80635: when evaluation is False, skipping this task 7557 1726882110.80638: _execute() done 7557 1726882110.80641: dumping result to json 7557 1726882110.80643: done dumping result, returning 7557 1726882110.80652: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [12673a56-9f93-ed48-b3a5-0000000000c7] 7557 1726882110.80657: sending task result for task 12673a56-9f93-ed48-b3a5-0000000000c7 7557 1726882110.80764: done sending task result for task 12673a56-9f93-ed48-b3a5-0000000000c7 7557 1726882110.80767: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 7557 1726882110.80819: no more pending results, returning what we have 7557 1726882110.80824: results queue empty 7557 1726882110.80825: checking for any_errors_fatal 7557 1726882110.80831: done checking for any_errors_fatal 7557 1726882110.80832: checking for max_fail_percentage 7557 1726882110.80834: done checking for max_fail_percentage 7557 1726882110.80835: checking to see if all hosts have failed and the running result is not ok 7557 1726882110.80836: done checking to see if all hosts have failed 7557 1726882110.80837: getting the remaining hosts for this loop 7557 1726882110.80838: done getting the remaining hosts for this loop 7557 1726882110.80841: getting the next task for host managed_node3 7557 1726882110.80848: done getting next task for host managed_node3 7557 1726882110.80851: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 7557 1726882110.80855: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7557 1726882110.80877: getting variables 7557 1726882110.80879: in VariableManager get_vars() 7557 1726882110.80931: Calling all_inventory to load vars for managed_node3 7557 1726882110.80934: Calling groups_inventory to load vars for managed_node3 7557 1726882110.80937: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882110.80948: Calling all_plugins_play to load vars for managed_node3 7557 1726882110.80951: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882110.80954: Calling groups_plugins_play to load vars for managed_node3 7557 1726882110.81990: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882110.82857: done with get_vars() 7557 1726882110.82878: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 21:28:30 -0400 (0:00:00.036) 0:00:36.682 ****** 7557 1726882110.82966: entering _queue_task() for managed_node3/fedora.linux_system_roles.network_connections 7557 1726882110.83251: worker is 1 (out of 1 available) 7557 1726882110.83263: exiting _queue_task() for managed_node3/fedora.linux_system_roles.network_connections 7557 1726882110.83275: done queuing things up, now waiting for results queue to drain 7557 1726882110.83276: waiting for pending results... 7557 1726882110.83584: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 7557 1726882110.83703: in run() - task 12673a56-9f93-ed48-b3a5-0000000000c8 7557 1726882110.83726: variable 'ansible_search_path' from source: unknown 7557 1726882110.83730: variable 'ansible_search_path' from source: unknown 7557 1726882110.83762: calling self._execute() 7557 1726882110.83870: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882110.83874: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882110.83896: variable 'omit' from source: magic vars 7557 1726882110.84282: variable 'ansible_distribution_major_version' from source: facts 7557 1726882110.84295: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882110.84304: variable 'omit' from source: magic vars 7557 1726882110.84352: variable 'omit' from source: magic vars 7557 1726882110.84512: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 7557 1726882110.86400: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 7557 1726882110.86447: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 7557 1726882110.86474: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 7557 1726882110.86501: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 7557 1726882110.86525: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 7557 1726882110.86580: variable 'network_provider' from source: set_fact 7557 1726882110.86676: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 7557 1726882110.86956: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7557 1726882110.86977: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7557 1726882110.87009: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7557 1726882110.87021: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7557 1726882110.87077: variable 'omit' from source: magic vars 7557 1726882110.87161: variable 'omit' from source: magic vars 7557 1726882110.87237: variable 'network_connections' from source: task vars 7557 1726882110.87247: variable 'interface' from source: play vars 7557 1726882110.87296: variable 'interface' from source: play vars 7557 1726882110.87429: variable 'omit' from source: magic vars 7557 1726882110.87475: variable '__lsr_ansible_managed' from source: task vars 7557 1726882110.87497: variable '__lsr_ansible_managed' from source: task vars 7557 1726882110.87852: Loaded config def from plugin (lookup/template) 7557 1726882110.87856: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 7557 1726882110.87858: File lookup term: get_ansible_managed.j2 7557 1726882110.87860: variable 'ansible_search_path' from source: unknown 7557 1726882110.87863: evaluation_path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 7557 1726882110.87866: search_path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 7557 1726882110.87869: variable 'ansible_search_path' from source: unknown 7557 1726882110.91277: variable 'ansible_managed' from source: unknown 7557 1726882110.91354: variable 'omit' from source: magic vars 7557 1726882110.91377: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7557 1726882110.91401: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 
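The template lookup above resolves get_ansible_managed.j2 through the role and playbook template search paths and renders the "Ansible managed" banner that later appears as the __header module argument. A minimal sketch of the same lookup pattern, assuming a hypothetical fact name (this is not the role's actual task):

    - name: Render an Ansible-managed header from a template    # illustrative sketch
      ansible.builtin.set_fact:
        my_header: "{{ lookup('template', 'get_ansible_managed.j2') }}"   # template name from the log; my_header is hypothetical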
7557 1726882110.91415: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7557 1726882110.91428: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7557 1726882110.91436: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7557 1726882110.91462: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7557 1726882110.91465: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882110.91467: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882110.91535: Set connection var ansible_module_compression to ZIP_DEFLATED 7557 1726882110.91541: Set connection var ansible_shell_executable to /bin/sh 7557 1726882110.91544: Set connection var ansible_shell_type to sh 7557 1726882110.91549: Set connection var ansible_pipelining to False 7557 1726882110.91551: Set connection var ansible_connection to ssh 7557 1726882110.91558: Set connection var ansible_timeout to 10 7557 1726882110.91575: variable 'ansible_shell_executable' from source: unknown 7557 1726882110.91577: variable 'ansible_connection' from source: unknown 7557 1726882110.91580: variable 'ansible_module_compression' from source: unknown 7557 1726882110.91582: variable 'ansible_shell_type' from source: unknown 7557 1726882110.91584: variable 'ansible_shell_executable' from source: unknown 7557 1726882110.91587: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882110.91591: variable 'ansible_pipelining' from source: unknown 7557 1726882110.91595: variable 'ansible_timeout' from source: unknown 7557 1726882110.91602: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882110.91694: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 7557 1726882110.91706: variable 'omit' from source: magic vars 7557 1726882110.91712: starting attempt loop 7557 1726882110.91715: running the handler 7557 1726882110.91726: _low_level_execute_command(): starting 7557 1726882110.91733: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7557 1726882110.92244: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882110.92248: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882110.92251: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7557 1726882110.92254: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882110.92300: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882110.92303: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882110.92310: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882110.92363: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882110.94028: stdout chunk (state=3): >>>/root <<< 7557 1726882110.94128: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882110.94154: stderr chunk (state=3): >>><<< 7557 1726882110.94157: stdout chunk (state=3): >>><<< 7557 1726882110.94179: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7557 1726882110.94190: _low_level_execute_command(): starting 7557 1726882110.94198: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882110.941791-9005-280079935582175 `" && echo ansible-tmp-1726882110.941791-9005-280079935582175="` echo /root/.ansible/tmp/ansible-tmp-1726882110.941791-9005-280079935582175 `" ) && sleep 0' 7557 1726882110.94597: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882110.94635: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7557 1726882110.94638: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882110.94640: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 7557 1726882110.94642: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found <<< 7557 
1726882110.94644: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882110.94690: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882110.94699: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882110.94701: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882110.94744: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882110.96614: stdout chunk (state=3): >>>ansible-tmp-1726882110.941791-9005-280079935582175=/root/.ansible/tmp/ansible-tmp-1726882110.941791-9005-280079935582175 <<< 7557 1726882110.96717: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882110.96744: stderr chunk (state=3): >>><<< 7557 1726882110.96747: stdout chunk (state=3): >>><<< 7557 1726882110.96763: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882110.941791-9005-280079935582175=/root/.ansible/tmp/ansible-tmp-1726882110.941791-9005-280079935582175 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7557 1726882110.96808: variable 'ansible_module_compression' from source: unknown 7557 1726882110.96846: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-7557ap94rh2e/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 7557 1726882110.96870: variable 'ansible_facts' from source: unknown 7557 1726882110.96940: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882110.941791-9005-280079935582175/AnsiballZ_network_connections.py 7557 1726882110.97042: Sending initial data 7557 1726882110.97045: Sent initial data (165 bytes) 7557 1726882110.97470: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7557 1726882110.97478: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7557 1726882110.97507: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 7557 1726882110.97511: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882110.97565: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882110.97571: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882110.97620: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882110.99149: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 7557 1726882110.99153: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 7557 1726882110.99191: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 7557 1726882110.99238: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-7557ap94rh2e/tmpoexgxd4o /root/.ansible/tmp/ansible-tmp-1726882110.941791-9005-280079935582175/AnsiballZ_network_connections.py <<< 7557 1726882110.99243: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882110.941791-9005-280079935582175/AnsiballZ_network_connections.py" <<< 7557 1726882110.99291: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-7557ap94rh2e/tmpoexgxd4o" to remote "/root/.ansible/tmp/ansible-tmp-1726882110.941791-9005-280079935582175/AnsiballZ_network_connections.py" <<< 7557 1726882110.99296: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882110.941791-9005-280079935582175/AnsiballZ_network_connections.py" <<< 7557 1726882111.00021: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882111.00064: stderr chunk (state=3): >>><<< 7557 1726882111.00067: stdout chunk (state=3): >>><<< 7557 1726882111.00101: done transferring module to remote 7557 1726882111.00110: _low_level_execute_command(): starting 7557 1726882111.00115: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882110.941791-9005-280079935582175/ /root/.ansible/tmp/ansible-tmp-1726882110.941791-9005-280079935582175/AnsiballZ_network_connections.py && sleep 0' 7557 1726882111.00562: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7557 1726882111.00566: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found <<< 7557 1726882111.00568: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882111.00570: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882111.00572: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882111.00626: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882111.00629: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882111.00632: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882111.00679: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882111.02387: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882111.02417: stderr chunk (state=3): >>><<< 7557 1726882111.02422: stdout chunk (state=3): >>><<< 7557 1726882111.02443: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7557 1726882111.02446: _low_level_execute_command(): starting 7557 1726882111.02450: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882110.941791-9005-280079935582175/AnsiballZ_network_connections.py && sleep 0' 7557 1726882111.02879: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7557 1726882111.02887: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7557 1726882111.02915: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882111.02918: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882111.02920: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882111.02976: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882111.02982: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882111.02985: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882111.03031: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882111.48920: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "[003] #0, state:up persistent_state:present, 'veth0': add connection veth0, e060611b-cbb5-4b12-af25-a4b709ff9a49\n[004] #0, state:up persistent_state:present, 'veth0': up connection veth0, e060611b-cbb5-4b12-af25-a4b709ff9a49 (not-active)\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "veth0", "type": "ethernet", "state": "up", "ip": {"auto_gateway": false, "dhcp4": false, "auto6": false, "address": ["2001:db8::2/64", "203.0.113.2/24"], "gateway6": "2001:db8::1", "gateway4": 
"203.0.113.1"}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "veth0", "type": "ethernet", "state": "up", "ip": {"auto_gateway": false, "dhcp4": false, "auto6": false, "address": ["2001:db8::2/64", "203.0.113.2/24"], "gateway6": "2001:db8::1", "gateway4": "203.0.113.1"}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 7557 1726882111.51076: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. <<< 7557 1726882111.51109: stderr chunk (state=3): >>><<< 7557 1726882111.51112: stdout chunk (state=3): >>><<< 7557 1726882111.51128: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "warnings": [], "stderr": "[003] #0, state:up persistent_state:present, 'veth0': add connection veth0, e060611b-cbb5-4b12-af25-a4b709ff9a49\n[004] #0, state:up persistent_state:present, 'veth0': up connection veth0, e060611b-cbb5-4b12-af25-a4b709ff9a49 (not-active)\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "veth0", "type": "ethernet", "state": "up", "ip": {"auto_gateway": false, "dhcp4": false, "auto6": false, "address": ["2001:db8::2/64", "203.0.113.2/24"], "gateway6": "2001:db8::1", "gateway4": "203.0.113.1"}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "veth0", "type": "ethernet", "state": "up", "ip": {"auto_gateway": false, "dhcp4": false, "auto6": false, "address": ["2001:db8::2/64", "203.0.113.2/24"], "gateway6": "2001:db8::1", "gateway4": "203.0.113.1"}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. 
7557 1726882111.51163: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'veth0', 'type': 'ethernet', 'state': 'up', 'ip': {'auto_gateway': False, 'dhcp4': False, 'auto6': False, 'address': ['2001:db8::2/64', '203.0.113.2/24'], 'gateway6': '2001:db8::1', 'gateway4': '203.0.113.1'}}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882110.941791-9005-280079935582175/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7557 1726882111.51172: _low_level_execute_command(): starting 7557 1726882111.51176: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882110.941791-9005-280079935582175/ > /dev/null 2>&1 && sleep 0' 7557 1726882111.51642: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7557 1726882111.51646: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found <<< 7557 1726882111.51648: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882111.51650: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882111.51652: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882111.51703: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882111.51707: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882111.51718: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882111.51764: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882111.53547: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882111.53575: stderr chunk (state=3): >>><<< 7557 1726882111.53578: stdout chunk (state=3): >>><<< 7557 1726882111.53592: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 
originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7557 1726882111.53602: handler run complete 7557 1726882111.53626: attempt loop complete, returning result 7557 1726882111.53629: _execute() done 7557 1726882111.53632: dumping result to json 7557 1726882111.53639: done dumping result, returning 7557 1726882111.53647: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [12673a56-9f93-ed48-b3a5-0000000000c8] 7557 1726882111.53652: sending task result for task 12673a56-9f93-ed48-b3a5-0000000000c8 changed: [managed_node3] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "ip": { "address": [ "2001:db8::2/64", "203.0.113.2/24" ], "auto6": false, "auto_gateway": false, "dhcp4": false, "gateway4": "203.0.113.1", "gateway6": "2001:db8::1" }, "name": "veth0", "state": "up", "type": "ethernet" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: [003] #0, state:up persistent_state:present, 'veth0': add connection veth0, e060611b-cbb5-4b12-af25-a4b709ff9a49 [004] #0, state:up persistent_state:present, 'veth0': up connection veth0, e060611b-cbb5-4b12-af25-a4b709ff9a49 (not-active) 7557 1726882111.53863: no more pending results, returning what we have 7557 1726882111.53867: results queue empty 7557 1726882111.53868: checking for any_errors_fatal 7557 1726882111.53873: done checking for any_errors_fatal 7557 1726882111.53874: checking for max_fail_percentage 7557 1726882111.53876: done checking for max_fail_percentage 7557 1726882111.53877: checking to see if all hosts have failed and the running result is not ok 7557 1726882111.53878: done checking to see if all hosts have failed 7557 1726882111.53878: getting the remaining hosts for this loop 7557 1726882111.53880: done getting the remaining hosts for this loop 7557 1726882111.53883: getting the next task for host managed_node3 7557 1726882111.53888: done getting next task for host managed_node3 7557 1726882111.53891: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 7557 1726882111.53896: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7557 1726882111.53906: getting variables 7557 1726882111.53908: in VariableManager get_vars() 7557 1726882111.53952: Calling all_inventory to load vars for managed_node3 7557 1726882111.53955: Calling groups_inventory to load vars for managed_node3 7557 1726882111.53957: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882111.53965: Calling all_plugins_play to load vars for managed_node3 7557 1726882111.53968: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882111.53970: Calling groups_plugins_play to load vars for managed_node3 7557 1726882111.54506: done sending task result for task 12673a56-9f93-ed48-b3a5-0000000000c8 7557 1726882111.54510: WORKER PROCESS EXITING 7557 1726882111.54922: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882111.55769: done with get_vars() 7557 1726882111.55786: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 21:28:31 -0400 (0:00:00.728) 0:00:37.411 ****** 7557 1726882111.55852: entering _queue_task() for managed_node3/fedora.linux_system_roles.network_state 7557 1726882111.56101: worker is 1 (out of 1 available) 7557 1726882111.56114: exiting _queue_task() for managed_node3/fedora.linux_system_roles.network_state 7557 1726882111.56128: done queuing things up, now waiting for results queue to drain 7557 1726882111.56130: waiting for pending results... 7557 1726882111.56318: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking state 7557 1726882111.56406: in run() - task 12673a56-9f93-ed48-b3a5-0000000000c9 7557 1726882111.56418: variable 'ansible_search_path' from source: unknown 7557 1726882111.56422: variable 'ansible_search_path' from source: unknown 7557 1726882111.56451: calling self._execute() 7557 1726882111.56534: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882111.56538: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882111.56548: variable 'omit' from source: magic vars 7557 1726882111.56833: variable 'ansible_distribution_major_version' from source: facts 7557 1726882111.56843: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882111.56930: variable 'network_state' from source: role '' defaults 7557 1726882111.56938: Evaluated conditional (network_state != {}): False 7557 1726882111.56941: when evaluation is False, skipping this task 7557 1726882111.56943: _execute() done 7557 1726882111.56946: dumping result to json 7557 1726882111.56949: done dumping result, returning 7557 1726882111.56956: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking state [12673a56-9f93-ed48-b3a5-0000000000c9] 7557 1726882111.56961: sending task result for task 12673a56-9f93-ed48-b3a5-0000000000c9 7557 1726882111.57054: done sending task result for task 12673a56-9f93-ed48-b3a5-0000000000c9 7557 1726882111.57056: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 7557 1726882111.57107: no more pending results, returning what we have 7557 
1726882111.57111: results queue empty 7557 1726882111.57112: checking for any_errors_fatal 7557 1726882111.57121: done checking for any_errors_fatal 7557 1726882111.57122: checking for max_fail_percentage 7557 1726882111.57124: done checking for max_fail_percentage 7557 1726882111.57125: checking to see if all hosts have failed and the running result is not ok 7557 1726882111.57125: done checking to see if all hosts have failed 7557 1726882111.57126: getting the remaining hosts for this loop 7557 1726882111.57128: done getting the remaining hosts for this loop 7557 1726882111.57131: getting the next task for host managed_node3 7557 1726882111.57136: done getting next task for host managed_node3 7557 1726882111.57140: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 7557 1726882111.57144: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7557 1726882111.57168: getting variables 7557 1726882111.57169: in VariableManager get_vars() 7557 1726882111.57214: Calling all_inventory to load vars for managed_node3 7557 1726882111.57217: Calling groups_inventory to load vars for managed_node3 7557 1726882111.57220: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882111.57229: Calling all_plugins_play to load vars for managed_node3 7557 1726882111.57231: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882111.57234: Calling groups_plugins_play to load vars for managed_node3 7557 1726882111.58009: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882111.58872: done with get_vars() 7557 1726882111.58887: done getting variables 7557 1726882111.58935: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 21:28:31 -0400 (0:00:00.031) 0:00:37.442 ****** 7557 1726882111.58957: entering _queue_task() for managed_node3/debug 7557 1726882111.59179: worker is 1 (out of 1 available) 7557 1726882111.59197: exiting _queue_task() for managed_node3/debug 7557 1726882111.59210: done queuing things up, now waiting for results queue to drain 7557 1726882111.59211: waiting for pending results... 
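The "Configure networking state" task above is skipped because network_state is left at its empty default, and the run moves on to a debug task that prints the stderr captured from the network_connections module. A minimal sketch of that debug pattern (the variable name __network_connections_result and the task wording are taken from the log; the rest is illustrative):

    - name: Show stderr messages for the network_connections    # wording from the task banner
      ansible.builtin.debug:
        var: __network_connections_result.stderr_lines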
7557 1726882111.59392: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 7557 1726882111.59490: in run() - task 12673a56-9f93-ed48-b3a5-0000000000ca 7557 1726882111.59509: variable 'ansible_search_path' from source: unknown 7557 1726882111.59513: variable 'ansible_search_path' from source: unknown 7557 1726882111.59541: calling self._execute() 7557 1726882111.59623: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882111.59626: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882111.59636: variable 'omit' from source: magic vars 7557 1726882111.59912: variable 'ansible_distribution_major_version' from source: facts 7557 1726882111.59923: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882111.59928: variable 'omit' from source: magic vars 7557 1726882111.59963: variable 'omit' from source: magic vars 7557 1726882111.59990: variable 'omit' from source: magic vars 7557 1726882111.60025: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7557 1726882111.60052: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7557 1726882111.60068: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7557 1726882111.60081: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7557 1726882111.60098: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7557 1726882111.60119: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7557 1726882111.60122: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882111.60125: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882111.60197: Set connection var ansible_module_compression to ZIP_DEFLATED 7557 1726882111.60201: Set connection var ansible_shell_executable to /bin/sh 7557 1726882111.60204: Set connection var ansible_shell_type to sh 7557 1726882111.60215: Set connection var ansible_pipelining to False 7557 1726882111.60217: Set connection var ansible_connection to ssh 7557 1726882111.60220: Set connection var ansible_timeout to 10 7557 1726882111.60234: variable 'ansible_shell_executable' from source: unknown 7557 1726882111.60237: variable 'ansible_connection' from source: unknown 7557 1726882111.60240: variable 'ansible_module_compression' from source: unknown 7557 1726882111.60242: variable 'ansible_shell_type' from source: unknown 7557 1726882111.60244: variable 'ansible_shell_executable' from source: unknown 7557 1726882111.60246: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882111.60251: variable 'ansible_pipelining' from source: unknown 7557 1726882111.60253: variable 'ansible_timeout' from source: unknown 7557 1726882111.60257: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882111.60360: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 7557 1726882111.60369: variable 'omit' from source: 
magic vars 7557 1726882111.60371: starting attempt loop 7557 1726882111.60376: running the handler 7557 1726882111.60472: variable '__network_connections_result' from source: set_fact 7557 1726882111.60516: handler run complete 7557 1726882111.60529: attempt loop complete, returning result 7557 1726882111.60533: _execute() done 7557 1726882111.60536: dumping result to json 7557 1726882111.60539: done dumping result, returning 7557 1726882111.60549: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [12673a56-9f93-ed48-b3a5-0000000000ca] 7557 1726882111.60551: sending task result for task 12673a56-9f93-ed48-b3a5-0000000000ca 7557 1726882111.60636: done sending task result for task 12673a56-9f93-ed48-b3a5-0000000000ca 7557 1726882111.60639: WORKER PROCESS EXITING ok: [managed_node3] => { "__network_connections_result.stderr_lines": [ "[003] #0, state:up persistent_state:present, 'veth0': add connection veth0, e060611b-cbb5-4b12-af25-a4b709ff9a49", "[004] #0, state:up persistent_state:present, 'veth0': up connection veth0, e060611b-cbb5-4b12-af25-a4b709ff9a49 (not-active)" ] } 7557 1726882111.60712: no more pending results, returning what we have 7557 1726882111.60715: results queue empty 7557 1726882111.60716: checking for any_errors_fatal 7557 1726882111.60723: done checking for any_errors_fatal 7557 1726882111.60724: checking for max_fail_percentage 7557 1726882111.60725: done checking for max_fail_percentage 7557 1726882111.60726: checking to see if all hosts have failed and the running result is not ok 7557 1726882111.60727: done checking to see if all hosts have failed 7557 1726882111.60728: getting the remaining hosts for this loop 7557 1726882111.60729: done getting the remaining hosts for this loop 7557 1726882111.60732: getting the next task for host managed_node3 7557 1726882111.60738: done getting next task for host managed_node3 7557 1726882111.60741: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 7557 1726882111.60744: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7557 1726882111.60754: getting variables 7557 1726882111.60756: in VariableManager get_vars() 7557 1726882111.60800: Calling all_inventory to load vars for managed_node3 7557 1726882111.60802: Calling groups_inventory to load vars for managed_node3 7557 1726882111.60804: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882111.60812: Calling all_plugins_play to load vars for managed_node3 7557 1726882111.60815: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882111.60817: Calling groups_plugins_play to load vars for managed_node3 7557 1726882111.61696: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882111.62550: done with get_vars() 7557 1726882111.62567: done getting variables 7557 1726882111.62615: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 21:28:31 -0400 (0:00:00.036) 0:00:37.479 ****** 7557 1726882111.62638: entering _queue_task() for managed_node3/debug 7557 1726882111.62870: worker is 1 (out of 1 available) 7557 1726882111.62884: exiting _queue_task() for managed_node3/debug 7557 1726882111.62900: done queuing things up, now waiting for results queue to drain 7557 1726882111.62901: waiting for pending results... 7557 1726882111.63083: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 7557 1726882111.63172: in run() - task 12673a56-9f93-ed48-b3a5-0000000000cb 7557 1726882111.63184: variable 'ansible_search_path' from source: unknown 7557 1726882111.63187: variable 'ansible_search_path' from source: unknown 7557 1726882111.63226: calling self._execute() 7557 1726882111.63297: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882111.63301: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882111.63308: variable 'omit' from source: magic vars 7557 1726882111.63576: variable 'ansible_distribution_major_version' from source: facts 7557 1726882111.63586: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882111.63591: variable 'omit' from source: magic vars 7557 1726882111.63631: variable 'omit' from source: magic vars 7557 1726882111.63658: variable 'omit' from source: magic vars 7557 1726882111.63690: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7557 1726882111.63719: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7557 1726882111.63733: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7557 1726882111.63746: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7557 1726882111.63757: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7557 1726882111.63783: variable 'inventory_hostname' from source: host vars for 
'managed_node3' 7557 1726882111.63786: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882111.63788: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882111.63859: Set connection var ansible_module_compression to ZIP_DEFLATED 7557 1726882111.63864: Set connection var ansible_shell_executable to /bin/sh 7557 1726882111.63867: Set connection var ansible_shell_type to sh 7557 1726882111.63872: Set connection var ansible_pipelining to False 7557 1726882111.63874: Set connection var ansible_connection to ssh 7557 1726882111.63886: Set connection var ansible_timeout to 10 7557 1726882111.63901: variable 'ansible_shell_executable' from source: unknown 7557 1726882111.63904: variable 'ansible_connection' from source: unknown 7557 1726882111.63906: variable 'ansible_module_compression' from source: unknown 7557 1726882111.63909: variable 'ansible_shell_type' from source: unknown 7557 1726882111.63911: variable 'ansible_shell_executable' from source: unknown 7557 1726882111.63913: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882111.63917: variable 'ansible_pipelining' from source: unknown 7557 1726882111.63919: variable 'ansible_timeout' from source: unknown 7557 1726882111.63924: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882111.64028: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 7557 1726882111.64036: variable 'omit' from source: magic vars 7557 1726882111.64041: starting attempt loop 7557 1726882111.64045: running the handler 7557 1726882111.64080: variable '__network_connections_result' from source: set_fact 7557 1726882111.64138: variable '__network_connections_result' from source: set_fact 7557 1726882111.64229: handler run complete 7557 1726882111.64248: attempt loop complete, returning result 7557 1726882111.64251: _execute() done 7557 1726882111.64253: dumping result to json 7557 1726882111.64258: done dumping result, returning 7557 1726882111.64265: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [12673a56-9f93-ed48-b3a5-0000000000cb] 7557 1726882111.64271: sending task result for task 12673a56-9f93-ed48-b3a5-0000000000cb 7557 1726882111.64358: done sending task result for task 12673a56-9f93-ed48-b3a5-0000000000cb 7557 1726882111.64360: WORKER PROCESS EXITING ok: [managed_node3] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "ip": { "address": [ "2001:db8::2/64", "203.0.113.2/24" ], "auto6": false, "auto_gateway": false, "dhcp4": false, "gateway4": "203.0.113.1", "gateway6": "2001:db8::1" }, "name": "veth0", "state": "up", "type": "ethernet" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "[003] #0, state:up persistent_state:present, 'veth0': add connection veth0, e060611b-cbb5-4b12-af25-a4b709ff9a49\n[004] #0, state:up persistent_state:present, 'veth0': up connection veth0, e060611b-cbb5-4b12-af25-a4b709ff9a49 (not-active)\n", "stderr_lines": [ "[003] #0, state:up persistent_state:present, 'veth0': add 
connection veth0, e060611b-cbb5-4b12-af25-a4b709ff9a49", "[004] #0, state:up persistent_state:present, 'veth0': up connection veth0, e060611b-cbb5-4b12-af25-a4b709ff9a49 (not-active)" ] } } 7557 1726882111.64455: no more pending results, returning what we have 7557 1726882111.64458: results queue empty 7557 1726882111.64459: checking for any_errors_fatal 7557 1726882111.64464: done checking for any_errors_fatal 7557 1726882111.64465: checking for max_fail_percentage 7557 1726882111.64466: done checking for max_fail_percentage 7557 1726882111.64467: checking to see if all hosts have failed and the running result is not ok 7557 1726882111.64468: done checking to see if all hosts have failed 7557 1726882111.64468: getting the remaining hosts for this loop 7557 1726882111.64472: done getting the remaining hosts for this loop 7557 1726882111.64475: getting the next task for host managed_node3 7557 1726882111.64480: done getting next task for host managed_node3 7557 1726882111.64484: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 7557 1726882111.64486: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7557 1726882111.64499: getting variables 7557 1726882111.64501: in VariableManager get_vars() 7557 1726882111.64545: Calling all_inventory to load vars for managed_node3 7557 1726882111.64547: Calling groups_inventory to load vars for managed_node3 7557 1726882111.64549: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882111.64557: Calling all_plugins_play to load vars for managed_node3 7557 1726882111.64560: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882111.64562: Calling groups_plugins_play to load vars for managed_node3 7557 1726882111.65316: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882111.66278: done with get_vars() 7557 1726882111.66296: done getting variables 7557 1726882111.66338: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 21:28:31 -0400 (0:00:00.037) 0:00:37.516 ****** 7557 1726882111.66361: entering _queue_task() for managed_node3/debug 7557 1726882111.66582: worker is 1 (out of 1 available) 7557 1726882111.66599: exiting _queue_task() for managed_node3/debug 7557 1726882111.66613: done queuing things up, now waiting for results queue to drain 7557 1726882111.66614: waiting for pending results... 
7557 1726882111.66796: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 7557 1726882111.66889: in run() - task 12673a56-9f93-ed48-b3a5-0000000000cc 7557 1726882111.66904: variable 'ansible_search_path' from source: unknown 7557 1726882111.66909: variable 'ansible_search_path' from source: unknown 7557 1726882111.66942: calling self._execute() 7557 1726882111.67021: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882111.67025: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882111.67039: variable 'omit' from source: magic vars 7557 1726882111.67312: variable 'ansible_distribution_major_version' from source: facts 7557 1726882111.67321: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882111.67408: variable 'network_state' from source: role '' defaults 7557 1726882111.67416: Evaluated conditional (network_state != {}): False 7557 1726882111.67419: when evaluation is False, skipping this task 7557 1726882111.67422: _execute() done 7557 1726882111.67424: dumping result to json 7557 1726882111.67426: done dumping result, returning 7557 1726882111.67434: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [12673a56-9f93-ed48-b3a5-0000000000cc] 7557 1726882111.67439: sending task result for task 12673a56-9f93-ed48-b3a5-0000000000cc 7557 1726882111.67521: done sending task result for task 12673a56-9f93-ed48-b3a5-0000000000cc 7557 1726882111.67524: WORKER PROCESS EXITING skipping: [managed_node3] => { "false_condition": "network_state != {}" } 7557 1726882111.67571: no more pending results, returning what we have 7557 1726882111.67574: results queue empty 7557 1726882111.67575: checking for any_errors_fatal 7557 1726882111.67584: done checking for any_errors_fatal 7557 1726882111.67585: checking for max_fail_percentage 7557 1726882111.67587: done checking for max_fail_percentage 7557 1726882111.67588: checking to see if all hosts have failed and the running result is not ok 7557 1726882111.67589: done checking to see if all hosts have failed 7557 1726882111.67589: getting the remaining hosts for this loop 7557 1726882111.67591: done getting the remaining hosts for this loop 7557 1726882111.67596: getting the next task for host managed_node3 7557 1726882111.67602: done getting next task for host managed_node3 7557 1726882111.67605: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 7557 1726882111.67608: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7557 1726882111.67629: getting variables 7557 1726882111.67631: in VariableManager get_vars() 7557 1726882111.67678: Calling all_inventory to load vars for managed_node3 7557 1726882111.67681: Calling groups_inventory to load vars for managed_node3 7557 1726882111.67683: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882111.67691: Calling all_plugins_play to load vars for managed_node3 7557 1726882111.67695: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882111.67699: Calling groups_plugins_play to load vars for managed_node3 7557 1726882111.68440: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882111.69284: done with get_vars() 7557 1726882111.69302: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 21:28:31 -0400 (0:00:00.030) 0:00:37.546 ****** 7557 1726882111.69367: entering _queue_task() for managed_node3/ping 7557 1726882111.69589: worker is 1 (out of 1 available) 7557 1726882111.69604: exiting _queue_task() for managed_node3/ping 7557 1726882111.69617: done queuing things up, now waiting for results queue to drain 7557 1726882111.69620: waiting for pending results... 7557 1726882111.69799: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Re-test connectivity 7557 1726882111.69881: in run() - task 12673a56-9f93-ed48-b3a5-0000000000cd 7557 1726882111.69894: variable 'ansible_search_path' from source: unknown 7557 1726882111.69898: variable 'ansible_search_path' from source: unknown 7557 1726882111.69926: calling self._execute() 7557 1726882111.70005: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882111.70010: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882111.70019: variable 'omit' from source: magic vars 7557 1726882111.70291: variable 'ansible_distribution_major_version' from source: facts 7557 1726882111.70305: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882111.70310: variable 'omit' from source: magic vars 7557 1726882111.70350: variable 'omit' from source: magic vars 7557 1726882111.70375: variable 'omit' from source: magic vars 7557 1726882111.70467: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7557 1726882111.70471: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7557 1726882111.70482: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7557 1726882111.70497: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7557 1726882111.70512: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7557 1726882111.70535: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7557 1726882111.70537: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882111.70540: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882111.70617: Set connection var ansible_module_compression to ZIP_DEFLATED 7557 1726882111.70620: Set connection var ansible_shell_executable to 
/bin/sh 7557 1726882111.70623: Set connection var ansible_shell_type to sh 7557 1726882111.70629: Set connection var ansible_pipelining to False 7557 1726882111.70632: Set connection var ansible_connection to ssh 7557 1726882111.70634: Set connection var ansible_timeout to 10 7557 1726882111.70650: variable 'ansible_shell_executable' from source: unknown 7557 1726882111.70653: variable 'ansible_connection' from source: unknown 7557 1726882111.70655: variable 'ansible_module_compression' from source: unknown 7557 1726882111.70658: variable 'ansible_shell_type' from source: unknown 7557 1726882111.70660: variable 'ansible_shell_executable' from source: unknown 7557 1726882111.70662: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882111.70666: variable 'ansible_pipelining' from source: unknown 7557 1726882111.70668: variable 'ansible_timeout' from source: unknown 7557 1726882111.70672: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882111.70825: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 7557 1726882111.70835: variable 'omit' from source: magic vars 7557 1726882111.70838: starting attempt loop 7557 1726882111.70842: running the handler 7557 1726882111.70853: _low_level_execute_command(): starting 7557 1726882111.70860: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7557 1726882111.71374: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882111.71378: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882111.71381: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7557 1726882111.71385: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882111.71435: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882111.71439: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882111.71441: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882111.71492: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882111.73064: stdout chunk (state=3): >>>/root <<< 7557 1726882111.73167: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882111.73190: stderr chunk (state=3): >>><<< 7557 1726882111.73198: stdout chunk (state=3): >>><<< 7557 1726882111.73216: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7557 1726882111.73228: _low_level_execute_command(): starting 7557 1726882111.73233: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882111.732153-9034-37892677749142 `" && echo ansible-tmp-1726882111.732153-9034-37892677749142="` echo /root/.ansible/tmp/ansible-tmp-1726882111.732153-9034-37892677749142 `" ) && sleep 0' 7557 1726882111.73651: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7557 1726882111.73654: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found <<< 7557 1726882111.73656: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration <<< 7557 1726882111.73665: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7557 1726882111.73667: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882111.73706: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882111.73722: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882111.73765: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882111.75636: stdout chunk (state=3): >>>ansible-tmp-1726882111.732153-9034-37892677749142=/root/.ansible/tmp/ansible-tmp-1726882111.732153-9034-37892677749142 <<< 7557 1726882111.75774: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882111.75784: stderr chunk (state=3): >>><<< 7557 1726882111.75797: stdout chunk (state=3): >>><<< 7557 1726882111.75846: _low_level_execute_command() 
done: rc=0, stdout=ansible-tmp-1726882111.732153-9034-37892677749142=/root/.ansible/tmp/ansible-tmp-1726882111.732153-9034-37892677749142 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7557 1726882111.75876: variable 'ansible_module_compression' from source: unknown 7557 1726882111.75924: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-7557ap94rh2e/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 7557 1726882111.75970: variable 'ansible_facts' from source: unknown 7557 1726882111.76069: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882111.732153-9034-37892677749142/AnsiballZ_ping.py 7557 1726882111.76314: Sending initial data 7557 1726882111.76317: Sent initial data (149 bytes) 7557 1726882111.76853: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882111.76902: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882111.76937: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882111.78479: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" 
revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 7557 1726882111.78528: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 7557 1726882111.78579: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-7557ap94rh2e/tmpsbsc8500 /root/.ansible/tmp/ansible-tmp-1726882111.732153-9034-37892677749142/AnsiballZ_ping.py <<< 7557 1726882111.78583: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882111.732153-9034-37892677749142/AnsiballZ_ping.py" <<< 7557 1726882111.78659: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-7557ap94rh2e/tmpsbsc8500" to remote "/root/.ansible/tmp/ansible-tmp-1726882111.732153-9034-37892677749142/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882111.732153-9034-37892677749142/AnsiballZ_ping.py" <<< 7557 1726882111.79372: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882111.79561: stderr chunk (state=3): >>><<< 7557 1726882111.79564: stdout chunk (state=3): >>><<< 7557 1726882111.79566: done transferring module to remote 7557 1726882111.79569: _low_level_execute_command(): starting 7557 1726882111.79572: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882111.732153-9034-37892677749142/ /root/.ansible/tmp/ansible-tmp-1726882111.732153-9034-37892677749142/AnsiballZ_ping.py && sleep 0' 7557 1726882111.80182: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7557 1726882111.80205: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882111.80307: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882111.80333: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882111.80346: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882111.80418: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882111.82413: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882111.82417: stdout chunk (state=3): >>><<< 7557 1726882111.82421: stderr chunk (state=3): >>><<< 7557 
1726882111.82430: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7557 1726882111.82438: _low_level_execute_command(): starting 7557 1726882111.82448: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882111.732153-9034-37892677749142/AnsiballZ_ping.py && sleep 0' 7557 1726882111.83085: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7557 1726882111.83105: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7557 1726882111.83120: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882111.83138: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7557 1726882111.83155: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 <<< 7557 1726882111.83177: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882111.83284: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882111.83304: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882111.83325: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882111.83422: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882111.98037: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 7557 1726882111.99448: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. 
<<< 7557 1726882111.99832: stderr chunk (state=3): >>><<< 7557 1726882111.99836: stdout chunk (state=3): >>><<< 7557 1726882111.99839: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. 7557 1726882111.99841: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882111.732153-9034-37892677749142/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7557 1726882111.99843: _low_level_execute_command(): starting 7557 1726882111.99846: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882111.732153-9034-37892677749142/ > /dev/null 2>&1 && sleep 0' 7557 1726882112.00596: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7557 1726882112.00600: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found <<< 7557 1726882112.00603: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882112.00620: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882112.00623: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882112.00676: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882112.00689: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882112.00746: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882112.02703: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882112.02714: stdout chunk (state=3): >>><<< 7557 1726882112.02770: stderr chunk (state=3): >>><<< 7557 1726882112.02790: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7557 1726882112.02816: handler run complete 7557 1726882112.02883: attempt loop complete, returning result 7557 1726882112.02901: _execute() done 7557 1726882112.02909: dumping result to json 7557 1726882112.02917: done dumping result, returning 7557 1726882112.02931: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Re-test connectivity [12673a56-9f93-ed48-b3a5-0000000000cd] 7557 1726882112.02979: sending task result for task 12673a56-9f93-ed48-b3a5-0000000000cd 7557 1726882112.03400: done sending task result for task 12673a56-9f93-ed48-b3a5-0000000000cd 7557 1726882112.03403: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "ping": "pong" } 7557 1726882112.03507: no more pending results, returning what we have 7557 1726882112.03511: results queue empty 7557 1726882112.03519: checking for any_errors_fatal 7557 1726882112.03525: done checking for any_errors_fatal 7557 1726882112.03526: checking for max_fail_percentage 7557 1726882112.03529: done checking for max_fail_percentage 7557 1726882112.03530: checking to see if all hosts have failed and the running result is not ok 7557 1726882112.03531: done checking to see if all hosts have failed 7557 1726882112.03531: getting the remaining hosts for this loop 7557 1726882112.03533: done getting the remaining hosts for this loop 7557 1726882112.03536: getting the next task for host managed_node3 7557 1726882112.03547: done getting next task for host managed_node3 7557 1726882112.03549: ^ task is: TASK: meta (role_complete) 7557 1726882112.03553: ^ state is: HOST STATE: block=2, task=27, rescue=0, always=0, handlers=0, 
run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7557 1726882112.03566: getting variables 7557 1726882112.03567: in VariableManager get_vars() 7557 1726882112.03927: Calling all_inventory to load vars for managed_node3 7557 1726882112.03930: Calling groups_inventory to load vars for managed_node3 7557 1726882112.03933: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882112.03942: Calling all_plugins_play to load vars for managed_node3 7557 1726882112.03945: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882112.03947: Calling groups_plugins_play to load vars for managed_node3 7557 1726882112.05666: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882112.07383: done with get_vars() 7557 1726882112.07413: done getting variables 7557 1726882112.07494: done queuing things up, now waiting for results queue to drain 7557 1726882112.07497: results queue empty 7557 1726882112.07498: checking for any_errors_fatal 7557 1726882112.07501: done checking for any_errors_fatal 7557 1726882112.07501: checking for max_fail_percentage 7557 1726882112.07502: done checking for max_fail_percentage 7557 1726882112.07503: checking to see if all hosts have failed and the running result is not ok 7557 1726882112.07504: done checking to see if all hosts have failed 7557 1726882112.07504: getting the remaining hosts for this loop 7557 1726882112.07505: done getting the remaining hosts for this loop 7557 1726882112.07557: getting the next task for host managed_node3 7557 1726882112.07562: done getting next task for host managed_node3 7557 1726882112.07564: ^ task is: TASK: Include the task 'assert_device_present.yml' 7557 1726882112.07566: ^ state is: HOST STATE: block=2, task=28, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7557 1726882112.07569: getting variables 7557 1726882112.07570: in VariableManager get_vars() 7557 1726882112.07589: Calling all_inventory to load vars for managed_node3 7557 1726882112.07591: Calling groups_inventory to load vars for managed_node3 7557 1726882112.07595: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882112.07601: Calling all_plugins_play to load vars for managed_node3 7557 1726882112.07603: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882112.07606: Calling groups_plugins_play to load vars for managed_node3 7557 1726882112.08797: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882112.10691: done with get_vars() 7557 1726882112.10811: done getting variables TASK [Include the task 'assert_device_present.yml'] **************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_auto_gateway.yml:108 Friday 20 September 2024 21:28:32 -0400 (0:00:00.415) 0:00:37.962 ****** 7557 1726882112.10924: entering _queue_task() for managed_node3/include_tasks 7557 1726882112.11469: worker is 1 (out of 1 available) 7557 1726882112.11481: exiting _queue_task() for managed_node3/include_tasks 7557 1726882112.11496: done queuing things up, now waiting for results queue to drain 7557 1726882112.11498: waiting for pending results... 7557 1726882112.12027: running TaskExecutor() for managed_node3/TASK: Include the task 'assert_device_present.yml' 7557 1726882112.12141: in run() - task 12673a56-9f93-ed48-b3a5-0000000000fd 7557 1726882112.12171: variable 'ansible_search_path' from source: unknown 7557 1726882112.12218: calling self._execute() 7557 1726882112.12376: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882112.12380: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882112.12383: variable 'omit' from source: magic vars 7557 1726882112.12917: variable 'ansible_distribution_major_version' from source: facts 7557 1726882112.12923: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882112.12926: _execute() done 7557 1726882112.12929: dumping result to json 7557 1726882112.12931: done dumping result, returning 7557 1726882112.12934: done running TaskExecutor() for managed_node3/TASK: Include the task 'assert_device_present.yml' [12673a56-9f93-ed48-b3a5-0000000000fd] 7557 1726882112.12937: sending task result for task 12673a56-9f93-ed48-b3a5-0000000000fd 7557 1726882112.13013: done sending task result for task 12673a56-9f93-ed48-b3a5-0000000000fd 7557 1726882112.13016: WORKER PROCESS EXITING 7557 1726882112.13048: no more pending results, returning what we have 7557 1726882112.13054: in VariableManager get_vars() 7557 1726882112.13121: Calling all_inventory to load vars for managed_node3 7557 1726882112.13124: Calling groups_inventory to load vars for managed_node3 7557 1726882112.13128: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882112.13142: Calling all_plugins_play to load vars for managed_node3 7557 1726882112.13146: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882112.13149: Calling groups_plugins_play to load vars for managed_node3 7557 1726882112.16003: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882112.18454: done with get_vars() 7557 1726882112.18474: variable 
'ansible_search_path' from source: unknown 7557 1726882112.18486: we have included files to process 7557 1726882112.18487: generating all_blocks data 7557 1726882112.18490: done generating all_blocks data 7557 1726882112.18496: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 7557 1726882112.18498: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 7557 1726882112.18500: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 7557 1726882112.18581: in VariableManager get_vars() 7557 1726882112.18607: done with get_vars() 7557 1726882112.18682: done processing included file 7557 1726882112.18684: iterating over new_blocks loaded from include file 7557 1726882112.18685: in VariableManager get_vars() 7557 1726882112.18702: done with get_vars() 7557 1726882112.18703: filtering new block on tags 7557 1726882112.18718: done filtering new block on tags 7557 1726882112.18720: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml for managed_node3 7557 1726882112.18724: extending task lists for all hosts with included blocks 7557 1726882112.22769: done extending task lists 7557 1726882112.22771: done processing included files 7557 1726882112.22772: results queue empty 7557 1726882112.22772: checking for any_errors_fatal 7557 1726882112.22773: done checking for any_errors_fatal 7557 1726882112.22774: checking for max_fail_percentage 7557 1726882112.22774: done checking for max_fail_percentage 7557 1726882112.22775: checking to see if all hosts have failed and the running result is not ok 7557 1726882112.22776: done checking to see if all hosts have failed 7557 1726882112.22776: getting the remaining hosts for this loop 7557 1726882112.22777: done getting the remaining hosts for this loop 7557 1726882112.22779: getting the next task for host managed_node3 7557 1726882112.22783: done getting next task for host managed_node3 7557 1726882112.22785: ^ task is: TASK: Include the task 'get_interface_stat.yml' 7557 1726882112.22787: ^ state is: HOST STATE: block=2, task=29, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7557 1726882112.22789: getting variables 7557 1726882112.22790: in VariableManager get_vars() 7557 1726882112.22809: Calling all_inventory to load vars for managed_node3 7557 1726882112.22811: Calling groups_inventory to load vars for managed_node3 7557 1726882112.22812: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882112.22817: Calling all_plugins_play to load vars for managed_node3 7557 1726882112.22818: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882112.22821: Calling groups_plugins_play to load vars for managed_node3 7557 1726882112.28199: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882112.29530: done with get_vars() 7557 1726882112.29561: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3 Friday 20 September 2024 21:28:32 -0400 (0:00:00.187) 0:00:38.149 ****** 7557 1726882112.29648: entering _queue_task() for managed_node3/include_tasks 7557 1726882112.29925: worker is 1 (out of 1 available) 7557 1726882112.29939: exiting _queue_task() for managed_node3/include_tasks 7557 1726882112.29953: done queuing things up, now waiting for results queue to drain 7557 1726882112.29954: waiting for pending results... 7557 1726882112.30138: running TaskExecutor() for managed_node3/TASK: Include the task 'get_interface_stat.yml' 7557 1726882112.30224: in run() - task 12673a56-9f93-ed48-b3a5-00000000143a 7557 1726882112.30237: variable 'ansible_search_path' from source: unknown 7557 1726882112.30241: variable 'ansible_search_path' from source: unknown 7557 1726882112.30269: calling self._execute() 7557 1726882112.30352: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882112.30356: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882112.30366: variable 'omit' from source: magic vars 7557 1726882112.30656: variable 'ansible_distribution_major_version' from source: facts 7557 1726882112.30666: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882112.30669: _execute() done 7557 1726882112.30674: dumping result to json 7557 1726882112.30677: done dumping result, returning 7557 1726882112.30683: done running TaskExecutor() for managed_node3/TASK: Include the task 'get_interface_stat.yml' [12673a56-9f93-ed48-b3a5-00000000143a] 7557 1726882112.30687: sending task result for task 12673a56-9f93-ed48-b3a5-00000000143a 7557 1726882112.30774: done sending task result for task 12673a56-9f93-ed48-b3a5-00000000143a 7557 1726882112.30777: WORKER PROCESS EXITING 7557 1726882112.30809: no more pending results, returning what we have 7557 1726882112.30815: in VariableManager get_vars() 7557 1726882112.30871: Calling all_inventory to load vars for managed_node3 7557 1726882112.30873: Calling groups_inventory to load vars for managed_node3 7557 1726882112.30876: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882112.30890: Calling all_plugins_play to load vars for managed_node3 7557 1726882112.30892: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882112.30897: Calling groups_plugins_play to load vars for managed_node3 7557 1726882112.31845: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 
1726882112.32963: done with get_vars() 7557 1726882112.32976: variable 'ansible_search_path' from source: unknown 7557 1726882112.32977: variable 'ansible_search_path' from source: unknown 7557 1726882112.33006: we have included files to process 7557 1726882112.33007: generating all_blocks data 7557 1726882112.33008: done generating all_blocks data 7557 1726882112.33009: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 7557 1726882112.33010: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 7557 1726882112.33012: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 7557 1726882112.33131: done processing included file 7557 1726882112.33132: iterating over new_blocks loaded from include file 7557 1726882112.33134: in VariableManager get_vars() 7557 1726882112.33149: done with get_vars() 7557 1726882112.33150: filtering new block on tags 7557 1726882112.33160: done filtering new block on tags 7557 1726882112.33161: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed_node3 7557 1726882112.33165: extending task lists for all hosts with included blocks 7557 1726882112.33233: done extending task lists 7557 1726882112.33234: done processing included files 7557 1726882112.33235: results queue empty 7557 1726882112.33235: checking for any_errors_fatal 7557 1726882112.33238: done checking for any_errors_fatal 7557 1726882112.33238: checking for max_fail_percentage 7557 1726882112.33239: done checking for max_fail_percentage 7557 1726882112.33240: checking to see if all hosts have failed and the running result is not ok 7557 1726882112.33240: done checking to see if all hosts have failed 7557 1726882112.33240: getting the remaining hosts for this loop 7557 1726882112.33241: done getting the remaining hosts for this loop 7557 1726882112.33243: getting the next task for host managed_node3 7557 1726882112.33245: done getting next task for host managed_node3 7557 1726882112.33247: ^ task is: TASK: Get stat for interface {{ interface }} 7557 1726882112.33248: ^ state is: HOST STATE: block=2, task=29, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7557 1726882112.33250: getting variables 7557 1726882112.33251: in VariableManager get_vars() 7557 1726882112.33262: Calling all_inventory to load vars for managed_node3 7557 1726882112.33263: Calling groups_inventory to load vars for managed_node3 7557 1726882112.33265: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882112.33268: Calling all_plugins_play to load vars for managed_node3 7557 1726882112.33270: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882112.33271: Calling groups_plugins_play to load vars for managed_node3 7557 1726882112.34242: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882112.35816: done with get_vars() 7557 1726882112.35840: done getting variables 7557 1726882112.36038: variable 'interface' from source: play vars TASK [Get stat for interface veth0] ******************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Friday 20 September 2024 21:28:32 -0400 (0:00:00.064) 0:00:38.213 ****** 7557 1726882112.36079: entering _queue_task() for managed_node3/stat 7557 1726882112.36430: worker is 1 (out of 1 available) 7557 1726882112.36442: exiting _queue_task() for managed_node3/stat 7557 1726882112.36455: done queuing things up, now waiting for results queue to drain 7557 1726882112.36456: waiting for pending results... 7557 1726882112.37015: running TaskExecutor() for managed_node3/TASK: Get stat for interface veth0 7557 1726882112.37020: in run() - task 12673a56-9f93-ed48-b3a5-0000000016ba 7557 1726882112.37024: variable 'ansible_search_path' from source: unknown 7557 1726882112.37027: variable 'ansible_search_path' from source: unknown 7557 1726882112.37029: calling self._execute() 7557 1726882112.37088: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882112.37092: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882112.37108: variable 'omit' from source: magic vars 7557 1726882112.37540: variable 'ansible_distribution_major_version' from source: facts 7557 1726882112.37552: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882112.37559: variable 'omit' from source: magic vars 7557 1726882112.37700: variable 'omit' from source: magic vars 7557 1726882112.37734: variable 'interface' from source: play vars 7557 1726882112.37753: variable 'omit' from source: magic vars 7557 1726882112.37796: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7557 1726882112.37844: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7557 1726882112.37866: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7557 1726882112.37887: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7557 1726882112.37904: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7557 1726882112.37944: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7557 1726882112.37948: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882112.37951: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882112.38073: Set 
connection var ansible_module_compression to ZIP_DEFLATED 7557 1726882112.38081: Set connection var ansible_shell_executable to /bin/sh 7557 1726882112.38085: Set connection var ansible_shell_type to sh 7557 1726882112.38100: Set connection var ansible_pipelining to False 7557 1726882112.38103: Set connection var ansible_connection to ssh 7557 1726882112.38105: Set connection var ansible_timeout to 10 7557 1726882112.38212: variable 'ansible_shell_executable' from source: unknown 7557 1726882112.38216: variable 'ansible_connection' from source: unknown 7557 1726882112.38219: variable 'ansible_module_compression' from source: unknown 7557 1726882112.38221: variable 'ansible_shell_type' from source: unknown 7557 1726882112.38224: variable 'ansible_shell_executable' from source: unknown 7557 1726882112.38227: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882112.38229: variable 'ansible_pipelining' from source: unknown 7557 1726882112.38232: variable 'ansible_timeout' from source: unknown 7557 1726882112.38234: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882112.38367: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 7557 1726882112.38387: variable 'omit' from source: magic vars 7557 1726882112.38391: starting attempt loop 7557 1726882112.38396: running the handler 7557 1726882112.38415: _low_level_execute_command(): starting 7557 1726882112.38423: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7557 1726882112.38888: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882112.38918: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882112.38924: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882112.38966: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882112.38975: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882112.38988: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882112.39048: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882112.40704: stdout chunk (state=3): >>>/root <<< 7557 1726882112.40802: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882112.40835: stderr chunk (state=3): >>><<< 7557 1726882112.40838: stdout chunk (state=3): >>><<< 7557 1726882112.40852: _low_level_execute_command() done: 
rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7557 1726882112.40907: _low_level_execute_command(): starting 7557 1726882112.40911: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882112.4085927-9066-278136837409440 `" && echo ansible-tmp-1726882112.4085927-9066-278136837409440="` echo /root/.ansible/tmp/ansible-tmp-1726882112.4085927-9066-278136837409440 `" ) && sleep 0' 7557 1726882112.41272: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7557 1726882112.41280: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7557 1726882112.41300: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882112.41322: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882112.41325: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882112.41370: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882112.41377: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882112.41379: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882112.41424: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882112.43287: stdout chunk (state=3): >>>ansible-tmp-1726882112.4085927-9066-278136837409440=/root/.ansible/tmp/ansible-tmp-1726882112.4085927-9066-278136837409440 <<< 7557 1726882112.43391: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882112.43420: stderr 
chunk (state=3): >>><<< 7557 1726882112.43423: stdout chunk (state=3): >>><<< 7557 1726882112.43440: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882112.4085927-9066-278136837409440=/root/.ansible/tmp/ansible-tmp-1726882112.4085927-9066-278136837409440 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7557 1726882112.43481: variable 'ansible_module_compression' from source: unknown 7557 1726882112.43528: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-7557ap94rh2e/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 7557 1726882112.43557: variable 'ansible_facts' from source: unknown 7557 1726882112.43624: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882112.4085927-9066-278136837409440/AnsiballZ_stat.py 7557 1726882112.43728: Sending initial data 7557 1726882112.43731: Sent initial data (151 bytes) 7557 1726882112.44158: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7557 1726882112.44165: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7557 1726882112.44188: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882112.44191: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 7557 1726882112.44196: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882112.44244: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882112.44265: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882112.44306: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 
7557 1726882112.45835: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 7557 1726882112.45878: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 7557 1726882112.45924: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-7557ap94rh2e/tmpn2k7tbzj /root/.ansible/tmp/ansible-tmp-1726882112.4085927-9066-278136837409440/AnsiballZ_stat.py <<< 7557 1726882112.45931: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882112.4085927-9066-278136837409440/AnsiballZ_stat.py" <<< 7557 1726882112.45973: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-7557ap94rh2e/tmpn2k7tbzj" to remote "/root/.ansible/tmp/ansible-tmp-1726882112.4085927-9066-278136837409440/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882112.4085927-9066-278136837409440/AnsiballZ_stat.py" <<< 7557 1726882112.46522: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882112.46561: stderr chunk (state=3): >>><<< 7557 1726882112.46565: stdout chunk (state=3): >>><<< 7557 1726882112.46610: done transferring module to remote 7557 1726882112.46618: _low_level_execute_command(): starting 7557 1726882112.46624: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882112.4085927-9066-278136837409440/ /root/.ansible/tmp/ansible-tmp-1726882112.4085927-9066-278136837409440/AnsiballZ_stat.py && sleep 0' 7557 1726882112.47076: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882112.47079: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882112.47082: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882112.47084: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882112.47134: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 
7557 1726882112.47137: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882112.47188: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882112.48875: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882112.48900: stderr chunk (state=3): >>><<< 7557 1726882112.48904: stdout chunk (state=3): >>><<< 7557 1726882112.48917: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7557 1726882112.48921: _low_level_execute_command(): starting 7557 1726882112.48926: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882112.4085927-9066-278136837409440/AnsiballZ_stat.py && sleep 0' 7557 1726882112.49370: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7557 1726882112.49374: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found <<< 7557 1726882112.49376: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882112.49379: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882112.49381: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882112.49426: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882112.49445: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882112.49491: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882112.64647: stdout 
chunk (state=3): >>> {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/veth0", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 25529, "dev": 23, "nlink": 1, "atime": 1726882104.6754808, "mtime": 1726882104.6754808, "ctime": 1726882104.6754808, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/veth0", "lnk_target": "../../devices/virtual/net/veth0", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/veth0", "follow": false, "checksum_algorithm": "sha1"}}} <<< 7557 1726882112.65941: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. <<< 7557 1726882112.65970: stderr chunk (state=3): >>><<< 7557 1726882112.65975: stdout chunk (state=3): >>><<< 7557 1726882112.65991: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/veth0", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 25529, "dev": 23, "nlink": 1, "atime": 1726882104.6754808, "mtime": 1726882104.6754808, "ctime": 1726882104.6754808, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/veth0", "lnk_target": "../../devices/virtual/net/veth0", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/veth0", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. 
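
The stat invocation above spells out the module arguments used against /sys/class/net/veth0 (get_attributes, get_checksum and get_mime all false), and a later task reads the result back as interface_stat. A minimal sketch of what the task at get_interface_stat.yml:3 presumably looks like, reconstructed only from those log entries; the templated path and the register name are inferred from the run, not quoted from the file:

- name: Get stat for interface {{ interface }}
  stat:
    get_attributes: false
    get_checksum: false
    get_mime: false
    path: "/sys/class/net/{{ interface }}"  # resolved to /sys/class/net/veth0 in the run above
  register: interface_stat  # assumed name; the later assert evaluates interface_stat.stat.exists

Checking /sys/class/net/<name> is a reasonable presence test because the kernel exposes every network device there; the result above accordingly reports islnk true with lnk_source /sys/devices/virtual/net/veth0 for the veth device.
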
7557 1726882112.66035: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/veth0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882112.4085927-9066-278136837409440/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7557 1726882112.66043: _low_level_execute_command(): starting 7557 1726882112.66048: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882112.4085927-9066-278136837409440/ > /dev/null 2>&1 && sleep 0' 7557 1726882112.66483: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7557 1726882112.66492: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7557 1726882112.66523: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 7557 1726882112.66527: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found <<< 7557 1726882112.66530: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882112.66583: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882112.66590: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882112.66592: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882112.66639: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882112.68444: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882112.68469: stderr chunk (state=3): >>><<< 7557 1726882112.68473: stdout chunk (state=3): >>><<< 7557 1726882112.68485: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration 
data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7557 1726882112.68491: handler run complete 7557 1726882112.68526: attempt loop complete, returning result 7557 1726882112.68530: _execute() done 7557 1726882112.68533: dumping result to json 7557 1726882112.68535: done dumping result, returning 7557 1726882112.68544: done running TaskExecutor() for managed_node3/TASK: Get stat for interface veth0 [12673a56-9f93-ed48-b3a5-0000000016ba] 7557 1726882112.68548: sending task result for task 12673a56-9f93-ed48-b3a5-0000000016ba 7557 1726882112.68651: done sending task result for task 12673a56-9f93-ed48-b3a5-0000000016ba 7557 1726882112.68654: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "stat": { "atime": 1726882104.6754808, "block_size": 4096, "blocks": 0, "ctime": 1726882104.6754808, "dev": 23, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 25529, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": true, "isreg": false, "issock": false, "isuid": false, "lnk_source": "/sys/devices/virtual/net/veth0", "lnk_target": "../../devices/virtual/net/veth0", "mode": "0777", "mtime": 1726882104.6754808, "nlink": 1, "path": "/sys/class/net/veth0", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "wgrp": true, "woth": true, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } 7557 1726882112.68737: no more pending results, returning what we have 7557 1726882112.68741: results queue empty 7557 1726882112.68742: checking for any_errors_fatal 7557 1726882112.68743: done checking for any_errors_fatal 7557 1726882112.68744: checking for max_fail_percentage 7557 1726882112.68746: done checking for max_fail_percentage 7557 1726882112.68747: checking to see if all hosts have failed and the running result is not ok 7557 1726882112.68747: done checking to see if all hosts have failed 7557 1726882112.68748: getting the remaining hosts for this loop 7557 1726882112.68749: done getting the remaining hosts for this loop 7557 1726882112.68753: getting the next task for host managed_node3 7557 1726882112.68760: done getting next task for host managed_node3 7557 1726882112.68763: ^ task is: TASK: Assert that the interface is present - '{{ interface }}' 7557 1726882112.68765: ^ state is: HOST STATE: block=2, task=29, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7557 1726882112.68770: getting variables 7557 1726882112.68771: in VariableManager get_vars() 7557 1726882112.68818: Calling all_inventory to load vars for managed_node3 7557 1726882112.68822: Calling groups_inventory to load vars for managed_node3 7557 1726882112.68824: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882112.68834: Calling all_plugins_play to load vars for managed_node3 7557 1726882112.68837: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882112.68839: Calling groups_plugins_play to load vars for managed_node3 7557 1726882112.69647: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882112.70511: done with get_vars() 7557 1726882112.70529: done getting variables 7557 1726882112.70570: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 7557 1726882112.70659: variable 'interface' from source: play vars TASK [Assert that the interface is present - 'veth0'] ************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5 Friday 20 September 2024 21:28:32 -0400 (0:00:00.346) 0:00:38.559 ****** 7557 1726882112.70681: entering _queue_task() for managed_node3/assert 7557 1726882112.70904: worker is 1 (out of 1 available) 7557 1726882112.70917: exiting _queue_task() for managed_node3/assert 7557 1726882112.70929: done queuing things up, now waiting for results queue to drain 7557 1726882112.70930: waiting for pending results... 
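
Combining the two task paths shown in this run (assert_device_present.yml:3 includes get_interface_stat.yml, and assert_device_present.yml:5 evaluates the conditional interface_stat.stat.exists), the included file presumably reduces to a sketch like the following; only the structure is taken from the log, and the failure message is a hypothetical addition:

- name: Include the task 'get_interface_stat.yml'
  include_tasks: get_interface_stat.yml

- name: Assert that the interface is present - '{{ interface }}'
  assert:
    that:
      - interface_stat.stat.exists
    msg: "Interface {{ interface }} is missing"  # hypothetical message, not shown in the log

In the run below the conditional evaluates to True for veth0, so the assert action returns changed: false with "All assertions passed".
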
7557 1726882112.71104: running TaskExecutor() for managed_node3/TASK: Assert that the interface is present - 'veth0' 7557 1726882112.71177: in run() - task 12673a56-9f93-ed48-b3a5-00000000143b 7557 1726882112.71191: variable 'ansible_search_path' from source: unknown 7557 1726882112.71197: variable 'ansible_search_path' from source: unknown 7557 1726882112.71226: calling self._execute() 7557 1726882112.71310: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882112.71314: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882112.71323: variable 'omit' from source: magic vars 7557 1726882112.71594: variable 'ansible_distribution_major_version' from source: facts 7557 1726882112.71604: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882112.71610: variable 'omit' from source: magic vars 7557 1726882112.71640: variable 'omit' from source: magic vars 7557 1726882112.71708: variable 'interface' from source: play vars 7557 1726882112.71729: variable 'omit' from source: magic vars 7557 1726882112.71755: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7557 1726882112.71782: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7557 1726882112.71801: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7557 1726882112.71816: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7557 1726882112.71826: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7557 1726882112.71852: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7557 1726882112.71855: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882112.71858: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882112.71930: Set connection var ansible_module_compression to ZIP_DEFLATED 7557 1726882112.71936: Set connection var ansible_shell_executable to /bin/sh 7557 1726882112.71941: Set connection var ansible_shell_type to sh 7557 1726882112.71943: Set connection var ansible_pipelining to False 7557 1726882112.71946: Set connection var ansible_connection to ssh 7557 1726882112.71953: Set connection var ansible_timeout to 10 7557 1726882112.71969: variable 'ansible_shell_executable' from source: unknown 7557 1726882112.71972: variable 'ansible_connection' from source: unknown 7557 1726882112.71975: variable 'ansible_module_compression' from source: unknown 7557 1726882112.71977: variable 'ansible_shell_type' from source: unknown 7557 1726882112.71980: variable 'ansible_shell_executable' from source: unknown 7557 1726882112.71982: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882112.71984: variable 'ansible_pipelining' from source: unknown 7557 1726882112.71987: variable 'ansible_timeout' from source: unknown 7557 1726882112.71991: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882112.72095: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 7557 1726882112.72106: 
variable 'omit' from source: magic vars 7557 1726882112.72109: starting attempt loop 7557 1726882112.72112: running the handler 7557 1726882112.72206: variable 'interface_stat' from source: set_fact 7557 1726882112.72219: Evaluated conditional (interface_stat.stat.exists): True 7557 1726882112.72224: handler run complete 7557 1726882112.72234: attempt loop complete, returning result 7557 1726882112.72236: _execute() done 7557 1726882112.72241: dumping result to json 7557 1726882112.72243: done dumping result, returning 7557 1726882112.72248: done running TaskExecutor() for managed_node3/TASK: Assert that the interface is present - 'veth0' [12673a56-9f93-ed48-b3a5-00000000143b] 7557 1726882112.72253: sending task result for task 12673a56-9f93-ed48-b3a5-00000000143b 7557 1726882112.72331: done sending task result for task 12673a56-9f93-ed48-b3a5-00000000143b 7557 1726882112.72334: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false } MSG: All assertions passed 7557 1726882112.72397: no more pending results, returning what we have 7557 1726882112.72401: results queue empty 7557 1726882112.72402: checking for any_errors_fatal 7557 1726882112.72410: done checking for any_errors_fatal 7557 1726882112.72410: checking for max_fail_percentage 7557 1726882112.72412: done checking for max_fail_percentage 7557 1726882112.72413: checking to see if all hosts have failed and the running result is not ok 7557 1726882112.72413: done checking to see if all hosts have failed 7557 1726882112.72414: getting the remaining hosts for this loop 7557 1726882112.72416: done getting the remaining hosts for this loop 7557 1726882112.72419: getting the next task for host managed_node3 7557 1726882112.72425: done getting next task for host managed_node3 7557 1726882112.72428: ^ task is: TASK: Include the task 'assert_profile_present.yml' 7557 1726882112.72429: ^ state is: HOST STATE: block=2, task=30, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7557 1726882112.72432: getting variables 7557 1726882112.72434: in VariableManager get_vars() 7557 1726882112.72476: Calling all_inventory to load vars for managed_node3 7557 1726882112.72478: Calling groups_inventory to load vars for managed_node3 7557 1726882112.72480: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882112.72489: Calling all_plugins_play to load vars for managed_node3 7557 1726882112.72492: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882112.72496: Calling groups_plugins_play to load vars for managed_node3 7557 1726882112.73345: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882112.74186: done with get_vars() 7557 1726882112.74203: done getting variables TASK [Include the task 'assert_profile_present.yml'] *************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_auto_gateway.yml:110 Friday 20 September 2024 21:28:32 -0400 (0:00:00.035) 0:00:38.595 ****** 7557 1726882112.74262: entering _queue_task() for managed_node3/include_tasks 7557 1726882112.74469: worker is 1 (out of 1 available) 7557 1726882112.74483: exiting _queue_task() for managed_node3/include_tasks 7557 1726882112.74499: done queuing things up, now waiting for results queue to drain 7557 1726882112.74500: waiting for pending results... 7557 1726882112.74659: running TaskExecutor() for managed_node3/TASK: Include the task 'assert_profile_present.yml' 7557 1726882112.74724: in run() - task 12673a56-9f93-ed48-b3a5-0000000000fe 7557 1726882112.74738: variable 'ansible_search_path' from source: unknown 7557 1726882112.74767: calling self._execute() 7557 1726882112.74851: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882112.74855: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882112.74865: variable 'omit' from source: magic vars 7557 1726882112.75134: variable 'ansible_distribution_major_version' from source: facts 7557 1726882112.75142: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882112.75148: _execute() done 7557 1726882112.75151: dumping result to json 7557 1726882112.75154: done dumping result, returning 7557 1726882112.75161: done running TaskExecutor() for managed_node3/TASK: Include the task 'assert_profile_present.yml' [12673a56-9f93-ed48-b3a5-0000000000fe] 7557 1726882112.75165: sending task result for task 12673a56-9f93-ed48-b3a5-0000000000fe 7557 1726882112.75252: done sending task result for task 12673a56-9f93-ed48-b3a5-0000000000fe 7557 1726882112.75255: WORKER PROCESS EXITING 7557 1726882112.75301: no more pending results, returning what we have 7557 1726882112.75305: in VariableManager get_vars() 7557 1726882112.75353: Calling all_inventory to load vars for managed_node3 7557 1726882112.75356: Calling groups_inventory to load vars for managed_node3 7557 1726882112.75358: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882112.75367: Calling all_plugins_play to load vars for managed_node3 7557 1726882112.75370: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882112.75372: Calling groups_plugins_play to load vars for managed_node3 7557 1726882112.76118: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882112.77245: done with get_vars() 7557 1726882112.77257: variable 
'ansible_search_path' from source: unknown 7557 1726882112.77267: we have included files to process 7557 1726882112.77268: generating all_blocks data 7557 1726882112.77269: done generating all_blocks data 7557 1726882112.77272: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 7557 1726882112.77273: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 7557 1726882112.77274: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 7557 1726882112.77358: in VariableManager get_vars() 7557 1726882112.77376: done with get_vars() 7557 1726882112.77543: done processing included file 7557 1726882112.77545: iterating over new_blocks loaded from include file 7557 1726882112.77546: in VariableManager get_vars() 7557 1726882112.77561: done with get_vars() 7557 1726882112.77562: filtering new block on tags 7557 1726882112.77574: done filtering new block on tags 7557 1726882112.77575: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml for managed_node3 7557 1726882112.77579: extending task lists for all hosts with included blocks 7557 1726882112.82213: done extending task lists 7557 1726882112.82214: done processing included files 7557 1726882112.82215: results queue empty 7557 1726882112.82215: checking for any_errors_fatal 7557 1726882112.82217: done checking for any_errors_fatal 7557 1726882112.82218: checking for max_fail_percentage 7557 1726882112.82219: done checking for max_fail_percentage 7557 1726882112.82220: checking to see if all hosts have failed and the running result is not ok 7557 1726882112.82220: done checking to see if all hosts have failed 7557 1726882112.82221: getting the remaining hosts for this loop 7557 1726882112.82221: done getting the remaining hosts for this loop 7557 1726882112.82223: getting the next task for host managed_node3 7557 1726882112.82226: done getting next task for host managed_node3 7557 1726882112.82227: ^ task is: TASK: Include the task 'get_profile_stat.yml' 7557 1726882112.82229: ^ state is: HOST STATE: block=2, task=31, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7557 1726882112.82231: getting variables 7557 1726882112.82232: in VariableManager get_vars() 7557 1726882112.82248: Calling all_inventory to load vars for managed_node3 7557 1726882112.82250: Calling groups_inventory to load vars for managed_node3 7557 1726882112.82251: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882112.82256: Calling all_plugins_play to load vars for managed_node3 7557 1726882112.82257: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882112.82259: Calling groups_plugins_play to load vars for managed_node3 7557 1726882112.82939: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882112.83971: done with get_vars() 7557 1726882112.83997: done getting variables TASK [Include the task 'get_profile_stat.yml'] ********************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:3 Friday 20 September 2024 21:28:32 -0400 (0:00:00.098) 0:00:38.693 ****** 7557 1726882112.84074: entering _queue_task() for managed_node3/include_tasks 7557 1726882112.84421: worker is 1 (out of 1 available) 7557 1726882112.84435: exiting _queue_task() for managed_node3/include_tasks 7557 1726882112.84449: done queuing things up, now waiting for results queue to drain 7557 1726882112.84451: waiting for pending results... 7557 1726882112.84813: running TaskExecutor() for managed_node3/TASK: Include the task 'get_profile_stat.yml' 7557 1726882112.84825: in run() - task 12673a56-9f93-ed48-b3a5-0000000016d2 7557 1726882112.84831: variable 'ansible_search_path' from source: unknown 7557 1726882112.84839: variable 'ansible_search_path' from source: unknown 7557 1726882112.84897: calling self._execute() 7557 1726882112.85008: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882112.85020: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882112.85036: variable 'omit' from source: magic vars 7557 1726882112.85449: variable 'ansible_distribution_major_version' from source: facts 7557 1726882112.85480: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882112.85484: _execute() done 7557 1726882112.85488: dumping result to json 7557 1726882112.85589: done dumping result, returning 7557 1726882112.85592: done running TaskExecutor() for managed_node3/TASK: Include the task 'get_profile_stat.yml' [12673a56-9f93-ed48-b3a5-0000000016d2] 7557 1726882112.85597: sending task result for task 12673a56-9f93-ed48-b3a5-0000000016d2 7557 1726882112.85664: done sending task result for task 12673a56-9f93-ed48-b3a5-0000000016d2 7557 1726882112.85667: WORKER PROCESS EXITING 7557 1726882112.85720: no more pending results, returning what we have 7557 1726882112.85726: in VariableManager get_vars() 7557 1726882112.85782: Calling all_inventory to load vars for managed_node3 7557 1726882112.85784: Calling groups_inventory to load vars for managed_node3 7557 1726882112.85787: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882112.85803: Calling all_plugins_play to load vars for managed_node3 7557 1726882112.85807: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882112.85810: Calling groups_plugins_play to load vars for managed_node3 7557 1726882112.87185: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882112.88892: 
done with get_vars() 7557 1726882112.88931: variable 'ansible_search_path' from source: unknown 7557 1726882112.88932: variable 'ansible_search_path' from source: unknown 7557 1726882112.88969: we have included files to process 7557 1726882112.88970: generating all_blocks data 7557 1726882112.88972: done generating all_blocks data 7557 1726882112.88973: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 7557 1726882112.88974: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 7557 1726882112.88976: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 7557 1726882112.89943: done processing included file 7557 1726882112.89945: iterating over new_blocks loaded from include file 7557 1726882112.89947: in VariableManager get_vars() 7557 1726882112.89981: done with get_vars() 7557 1726882112.89983: filtering new block on tags 7557 1726882112.90017: done filtering new block on tags 7557 1726882112.90022: in VariableManager get_vars() 7557 1726882112.90044: done with get_vars() 7557 1726882112.90046: filtering new block on tags 7557 1726882112.90063: done filtering new block on tags 7557 1726882112.90065: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed_node3 7557 1726882112.90070: extending task lists for all hosts with included blocks 7557 1726882112.90210: done extending task lists 7557 1726882112.90211: done processing included files 7557 1726882112.90212: results queue empty 7557 1726882112.90213: checking for any_errors_fatal 7557 1726882112.90216: done checking for any_errors_fatal 7557 1726882112.90217: checking for max_fail_percentage 7557 1726882112.90218: done checking for max_fail_percentage 7557 1726882112.90219: checking to see if all hosts have failed and the running result is not ok 7557 1726882112.90219: done checking to see if all hosts have failed 7557 1726882112.90220: getting the remaining hosts for this loop 7557 1726882112.90221: done getting the remaining hosts for this loop 7557 1726882112.90223: getting the next task for host managed_node3 7557 1726882112.90227: done getting next task for host managed_node3 7557 1726882112.90229: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag 7557 1726882112.90232: ^ state is: HOST STATE: block=2, task=31, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7557 1726882112.90235: getting variables 7557 1726882112.90236: in VariableManager get_vars() 7557 1726882112.90251: Calling all_inventory to load vars for managed_node3 7557 1726882112.90254: Calling groups_inventory to load vars for managed_node3 7557 1726882112.90255: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882112.90261: Calling all_plugins_play to load vars for managed_node3 7557 1726882112.90263: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882112.90266: Calling groups_plugins_play to load vars for managed_node3 7557 1726882112.91444: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882112.93003: done with get_vars() 7557 1726882112.93024: done getting variables 7557 1726882112.93062: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Initialize NM profile exist and ansible_managed comment flag] ************ task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3 Friday 20 September 2024 21:28:32 -0400 (0:00:00.090) 0:00:38.783 ****** 7557 1726882112.93092: entering _queue_task() for managed_node3/set_fact 7557 1726882112.93426: worker is 1 (out of 1 available) 7557 1726882112.93439: exiting _queue_task() for managed_node3/set_fact 7557 1726882112.93454: done queuing things up, now waiting for results queue to drain 7557 1726882112.93455: waiting for pending results... 7557 1726882112.93720: running TaskExecutor() for managed_node3/TASK: Initialize NM profile exist and ansible_managed comment flag 7557 1726882112.93878: in run() - task 12673a56-9f93-ed48-b3a5-00000000195f 7557 1726882112.93883: variable 'ansible_search_path' from source: unknown 7557 1726882112.93885: variable 'ansible_search_path' from source: unknown 7557 1726882112.93986: calling self._execute() 7557 1726882112.94026: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882112.94038: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882112.94052: variable 'omit' from source: magic vars 7557 1726882112.94432: variable 'ansible_distribution_major_version' from source: facts 7557 1726882112.94447: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882112.94456: variable 'omit' from source: magic vars 7557 1726882112.94501: variable 'omit' from source: magic vars 7557 1726882112.94542: variable 'omit' from source: magic vars 7557 1726882112.94636: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7557 1726882112.94680: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7557 1726882112.94709: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7557 1726882112.94732: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7557 1726882112.94757: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7557 1726882112.94795: variable 'inventory_hostname' from source: host vars for 
'managed_node3' 7557 1726882112.94805: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882112.94814: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882112.94963: Set connection var ansible_module_compression to ZIP_DEFLATED 7557 1726882112.94967: Set connection var ansible_shell_executable to /bin/sh 7557 1726882112.94970: Set connection var ansible_shell_type to sh 7557 1726882112.94972: Set connection var ansible_pipelining to False 7557 1726882112.94974: Set connection var ansible_connection to ssh 7557 1726882112.94976: Set connection var ansible_timeout to 10 7557 1726882112.94999: variable 'ansible_shell_executable' from source: unknown 7557 1726882112.95008: variable 'ansible_connection' from source: unknown 7557 1726882112.95072: variable 'ansible_module_compression' from source: unknown 7557 1726882112.95076: variable 'ansible_shell_type' from source: unknown 7557 1726882112.95078: variable 'ansible_shell_executable' from source: unknown 7557 1726882112.95080: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882112.95082: variable 'ansible_pipelining' from source: unknown 7557 1726882112.95085: variable 'ansible_timeout' from source: unknown 7557 1726882112.95087: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882112.95206: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 7557 1726882112.95224: variable 'omit' from source: magic vars 7557 1726882112.95234: starting attempt loop 7557 1726882112.95243: running the handler 7557 1726882112.95262: handler run complete 7557 1726882112.95277: attempt loop complete, returning result 7557 1726882112.95400: _execute() done 7557 1726882112.95405: dumping result to json 7557 1726882112.95409: done dumping result, returning 7557 1726882112.95412: done running TaskExecutor() for managed_node3/TASK: Initialize NM profile exist and ansible_managed comment flag [12673a56-9f93-ed48-b3a5-00000000195f] 7557 1726882112.95415: sending task result for task 12673a56-9f93-ed48-b3a5-00000000195f 7557 1726882112.95486: done sending task result for task 12673a56-9f93-ed48-b3a5-00000000195f 7557 1726882112.95490: WORKER PROCESS EXITING ok: [managed_node3] => { "ansible_facts": { "lsr_net_profile_ansible_managed": false, "lsr_net_profile_exists": false, "lsr_net_profile_fingerprint": false }, "changed": false } 7557 1726882112.95546: no more pending results, returning what we have 7557 1726882112.95549: results queue empty 7557 1726882112.95550: checking for any_errors_fatal 7557 1726882112.95551: done checking for any_errors_fatal 7557 1726882112.95552: checking for max_fail_percentage 7557 1726882112.95553: done checking for max_fail_percentage 7557 1726882112.95554: checking to see if all hosts have failed and the running result is not ok 7557 1726882112.95555: done checking to see if all hosts have failed 7557 1726882112.95556: getting the remaining hosts for this loop 7557 1726882112.95557: done getting the remaining hosts for this loop 7557 1726882112.95560: getting the next task for host managed_node3 7557 1726882112.95703: done getting next task for host managed_node3 7557 1726882112.95706: ^ task is: TASK: Stat profile file 7557 1726882112.95710: ^ state is: HOST 
STATE: block=2, task=31, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7557 1726882112.95714: getting variables 7557 1726882112.95715: in VariableManager get_vars() 7557 1726882112.95758: Calling all_inventory to load vars for managed_node3 7557 1726882112.95761: Calling groups_inventory to load vars for managed_node3 7557 1726882112.95763: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882112.95772: Calling all_plugins_play to load vars for managed_node3 7557 1726882112.95774: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882112.95777: Calling groups_plugins_play to load vars for managed_node3 7557 1726882112.97179: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882112.98785: done with get_vars() 7557 1726882112.98815: done getting variables TASK [Stat profile file] ******************************************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9 Friday 20 September 2024 21:28:32 -0400 (0:00:00.058) 0:00:38.842 ****** 7557 1726882112.98919: entering _queue_task() for managed_node3/stat 7557 1726882112.99275: worker is 1 (out of 1 available) 7557 1726882112.99297: exiting _queue_task() for managed_node3/stat 7557 1726882112.99313: done queuing things up, now waiting for results queue to drain 7557 1726882112.99315: waiting for pending results... 
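The ok result above, with lsr_net_profile_ansible_managed, lsr_net_profile_exists and lsr_net_profile_fingerprint all set to false, suggests the task at get_profile_stat.yml:3 is roughly the following set_fact. This is a sketch reconstructed from the logged ansible_facts, not the verbatim file contents:

- name: Initialize NM profile exist and ansible_managed comment flag
  set_fact:
    lsr_net_profile_exists: false
    lsr_net_profile_ansible_managed: false
    lsr_net_profile_fingerprint: false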
7557 1726882112.99584: running TaskExecutor() for managed_node3/TASK: Stat profile file 7557 1726882112.99782: in run() - task 12673a56-9f93-ed48-b3a5-000000001960 7557 1726882112.99786: variable 'ansible_search_path' from source: unknown 7557 1726882112.99789: variable 'ansible_search_path' from source: unknown 7557 1726882112.99805: calling self._execute() 7557 1726882112.99933: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882112.99954: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882112.99981: variable 'omit' from source: magic vars 7557 1726882113.00514: variable 'ansible_distribution_major_version' from source: facts 7557 1726882113.00601: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882113.00611: variable 'omit' from source: magic vars 7557 1726882113.00674: variable 'omit' from source: magic vars 7557 1726882113.00779: variable 'profile' from source: include params 7557 1726882113.00790: variable 'interface' from source: play vars 7557 1726882113.00861: variable 'interface' from source: play vars 7557 1726882113.00887: variable 'omit' from source: magic vars 7557 1726882113.00935: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7557 1726882113.00971: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7557 1726882113.00992: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7557 1726882113.01098: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7557 1726882113.01101: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7557 1726882113.01104: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7557 1726882113.01106: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882113.01108: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882113.01182: Set connection var ansible_module_compression to ZIP_DEFLATED 7557 1726882113.01198: Set connection var ansible_shell_executable to /bin/sh 7557 1726882113.01205: Set connection var ansible_shell_type to sh 7557 1726882113.01213: Set connection var ansible_pipelining to False 7557 1726882113.01220: Set connection var ansible_connection to ssh 7557 1726882113.01233: Set connection var ansible_timeout to 10 7557 1726882113.01258: variable 'ansible_shell_executable' from source: unknown 7557 1726882113.01265: variable 'ansible_connection' from source: unknown 7557 1726882113.01271: variable 'ansible_module_compression' from source: unknown 7557 1726882113.01276: variable 'ansible_shell_type' from source: unknown 7557 1726882113.01282: variable 'ansible_shell_executable' from source: unknown 7557 1726882113.01288: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882113.01301: variable 'ansible_pipelining' from source: unknown 7557 1726882113.01308: variable 'ansible_timeout' from source: unknown 7557 1726882113.01315: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882113.01522: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 7557 1726882113.01559: variable 'omit' from source: magic vars 7557 1726882113.01562: starting attempt loop 7557 1726882113.01565: running the handler 7557 1726882113.01571: _low_level_execute_command(): starting 7557 1726882113.01582: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7557 1726882113.02300: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7557 1726882113.02399: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882113.02416: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882113.02439: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882113.02457: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882113.02471: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882113.02562: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882113.04219: stdout chunk (state=3): >>>/root <<< 7557 1726882113.04357: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882113.04363: stdout chunk (state=3): >>><<< 7557 1726882113.04371: stderr chunk (state=3): >>><<< 7557 1726882113.04408: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7557 1726882113.04423: _low_level_execute_command(): starting 7557 
1726882113.04499: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882113.0440748-9083-116980706021439 `" && echo ansible-tmp-1726882113.0440748-9083-116980706021439="` echo /root/.ansible/tmp/ansible-tmp-1726882113.0440748-9083-116980706021439 `" ) && sleep 0' 7557 1726882113.05092: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7557 1726882113.05106: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7557 1726882113.05118: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882113.05132: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7557 1726882113.05146: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 <<< 7557 1726882113.05165: stderr chunk (state=3): >>>debug2: match not found <<< 7557 1726882113.05176: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882113.05278: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882113.05295: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882113.05383: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882113.07216: stdout chunk (state=3): >>>ansible-tmp-1726882113.0440748-9083-116980706021439=/root/.ansible/tmp/ansible-tmp-1726882113.0440748-9083-116980706021439 <<< 7557 1726882113.07382: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882113.07386: stdout chunk (state=3): >>><<< 7557 1726882113.07388: stderr chunk (state=3): >>><<< 7557 1726882113.07825: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882113.0440748-9083-116980706021439=/root/.ansible/tmp/ansible-tmp-1726882113.0440748-9083-116980706021439 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: 
Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7557 1726882113.07829: variable 'ansible_module_compression' from source: unknown 7557 1726882113.07831: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-7557ap94rh2e/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 7557 1726882113.07833: variable 'ansible_facts' from source: unknown 7557 1726882113.07836: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882113.0440748-9083-116980706021439/AnsiballZ_stat.py 7557 1726882113.07980: Sending initial data 7557 1726882113.07983: Sent initial data (151 bytes) 7557 1726882113.08516: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7557 1726882113.08596: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7557 1726882113.08614: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882113.08649: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882113.08662: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882113.08683: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882113.08764: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882113.10251: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 7557 1726882113.10413: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 7557 1726882113.10480: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-7557ap94rh2e/tmpe1kgu2ez /root/.ansible/tmp/ansible-tmp-1726882113.0440748-9083-116980706021439/AnsiballZ_stat.py <<< 7557 1726882113.10485: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882113.0440748-9083-116980706021439/AnsiballZ_stat.py" <<< 7557 1726882113.10548: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-7557ap94rh2e/tmpe1kgu2ez" to remote "/root/.ansible/tmp/ansible-tmp-1726882113.0440748-9083-116980706021439/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882113.0440748-9083-116980706021439/AnsiballZ_stat.py" <<< 7557 1726882113.11400: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882113.11404: stdout chunk (state=3): >>><<< 7557 1726882113.11406: stderr chunk (state=3): >>><<< 7557 1726882113.11408: done transferring module to remote 7557 1726882113.11426: _low_level_execute_command(): starting 7557 1726882113.11435: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882113.0440748-9083-116980706021439/ /root/.ansible/tmp/ansible-tmp-1726882113.0440748-9083-116980706021439/AnsiballZ_stat.py && sleep 0' 7557 1726882113.12060: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7557 1726882113.12078: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7557 1726882113.12098: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882113.12119: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7557 1726882113.12137: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 <<< 7557 1726882113.12215: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882113.12251: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882113.12271: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882113.12295: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882113.12362: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882113.14106: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882113.14199: stdout chunk (state=3): >>><<< 7557 1726882113.14203: stderr chunk (state=3): >>><<< 7557 1726882113.14206: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7557 1726882113.14209: _low_level_execute_command(): starting 7557 1726882113.14211: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882113.0440748-9083-116980706021439/AnsiballZ_stat.py && sleep 0' 7557 1726882113.15010: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882113.15058: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882113.15075: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882113.15151: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882113.30565: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-veth0", "follow": false, "checksum_algorithm": "sha1"}}} <<< 7557 1726882113.31939: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. 
<<< 7557 1726882113.31943: stdout chunk (state=3): >>><<< 7557 1726882113.31946: stderr chunk (state=3): >>><<< 7557 1726882113.32074: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-veth0", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. 7557 1726882113.32078: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-veth0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882113.0440748-9083-116980706021439/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7557 1726882113.32081: _low_level_execute_command(): starting 7557 1726882113.32083: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882113.0440748-9083-116980706021439/ > /dev/null 2>&1 && sleep 0' 7557 1726882113.32627: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7557 1726882113.32646: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7557 1726882113.32662: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882113.32708: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882113.32791: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882113.32824: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882113.32827: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882113.32915: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882113.34901: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882113.34905: stderr chunk (state=3): >>><<< 7557 1726882113.34907: stdout chunk (state=3): >>><<< 7557 1726882113.34910: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7557 1726882113.34912: handler run complete 7557 1726882113.34918: attempt loop complete, returning result 7557 1726882113.34921: _execute() done 7557 1726882113.34922: dumping result to json 7557 1726882113.34924: done dumping result, returning 7557 1726882113.34926: done running TaskExecutor() for managed_node3/TASK: Stat profile file [12673a56-9f93-ed48-b3a5-000000001960] 7557 1726882113.34927: sending task result for task 12673a56-9f93-ed48-b3a5-000000001960 7557 1726882113.34991: done sending task result for task 12673a56-9f93-ed48-b3a5-000000001960 7557 1726882113.34996: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "stat": { "exists": false } } 7557 1726882113.35083: no more pending results, returning what we have 7557 1726882113.35088: results queue empty 7557 1726882113.35089: checking for any_errors_fatal 7557 1726882113.35105: done checking for any_errors_fatal 7557 1726882113.35106: checking for max_fail_percentage 7557 1726882113.35108: done checking for max_fail_percentage 7557 1726882113.35109: checking to see if all hosts have failed and the running result is not ok 7557 1726882113.35110: done checking to see if all hosts have failed 7557 1726882113.35111: getting the remaining hosts for this loop 7557 1726882113.35113: done getting the remaining hosts for this loop 7557 
1726882113.35117: getting the next task for host managed_node3 7557 1726882113.35124: done getting next task for host managed_node3 7557 1726882113.35127: ^ task is: TASK: Set NM profile exist flag based on the profile files 7557 1726882113.35136: ^ state is: HOST STATE: block=2, task=31, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7557 1726882113.35141: getting variables 7557 1726882113.35143: in VariableManager get_vars() 7557 1726882113.35253: Calling all_inventory to load vars for managed_node3 7557 1726882113.35256: Calling groups_inventory to load vars for managed_node3 7557 1726882113.35259: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882113.35270: Calling all_plugins_play to load vars for managed_node3 7557 1726882113.35273: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882113.35276: Calling groups_plugins_play to load vars for managed_node3 7557 1726882113.37084: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882113.38673: done with get_vars() 7557 1726882113.38704: done getting variables 7557 1726882113.38810: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag based on the profile files] ******************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17 Friday 20 September 2024 21:28:33 -0400 (0:00:00.399) 0:00:39.241 ****** 7557 1726882113.38847: entering _queue_task() for managed_node3/set_fact 7557 1726882113.39349: worker is 1 (out of 1 available) 7557 1726882113.39363: exiting _queue_task() for managed_node3/set_fact 7557 1726882113.39377: done queuing things up, now waiting for results queue to drain 7557 1726882113.39378: waiting for pending results... 
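Based on the module_args logged for the stat call above (path /etc/sysconfig/network-scripts/ifcfg-veth0 with get_attributes, get_checksum and get_mime disabled) and on the profile_stat.stat.exists condition evaluated next, the tasks at get_profile_stat.yml:9 and :17 look roughly like the sketch below. The {{ profile }} templating and the fact set by the second task are assumptions; only the resolved values and the skipped condition appear in this log:

- name: Stat profile file
  stat:
    path: "/etc/sysconfig/network-scripts/ifcfg-{{ profile }}"  # resolves to ifcfg-veth0 in this run; templating is assumed
    get_attributes: false
    get_checksum: false
    get_mime: false
  register: profile_stat  # referenced below as profile_stat.stat.exists

- name: Set NM profile exist flag based on the profile files
  set_fact:
    lsr_net_profile_exists: true  # assumed; the task is skipped in this run, so the value never appears in the log
  when: profile_stat.stat.exists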
7557 1726882113.39685: running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag based on the profile files 7557 1726882113.39964: in run() - task 12673a56-9f93-ed48-b3a5-000000001961 7557 1726882113.39968: variable 'ansible_search_path' from source: unknown 7557 1726882113.39971: variable 'ansible_search_path' from source: unknown 7557 1726882113.39973: calling self._execute() 7557 1726882113.39979: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882113.39987: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882113.40005: variable 'omit' from source: magic vars 7557 1726882113.40425: variable 'ansible_distribution_major_version' from source: facts 7557 1726882113.40437: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882113.40570: variable 'profile_stat' from source: set_fact 7557 1726882113.40615: Evaluated conditional (profile_stat.stat.exists): False 7557 1726882113.40619: when evaluation is False, skipping this task 7557 1726882113.40621: _execute() done 7557 1726882113.40623: dumping result to json 7557 1726882113.40626: done dumping result, returning 7557 1726882113.40628: done running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag based on the profile files [12673a56-9f93-ed48-b3a5-000000001961] 7557 1726882113.40630: sending task result for task 12673a56-9f93-ed48-b3a5-000000001961 7557 1726882113.40942: done sending task result for task 12673a56-9f93-ed48-b3a5-000000001961 7557 1726882113.40946: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 7557 1726882113.40988: no more pending results, returning what we have 7557 1726882113.40995: results queue empty 7557 1726882113.40998: checking for any_errors_fatal 7557 1726882113.41004: done checking for any_errors_fatal 7557 1726882113.41005: checking for max_fail_percentage 7557 1726882113.41007: done checking for max_fail_percentage 7557 1726882113.41008: checking to see if all hosts have failed and the running result is not ok 7557 1726882113.41009: done checking to see if all hosts have failed 7557 1726882113.41010: getting the remaining hosts for this loop 7557 1726882113.41011: done getting the remaining hosts for this loop 7557 1726882113.41015: getting the next task for host managed_node3 7557 1726882113.41021: done getting next task for host managed_node3 7557 1726882113.41023: ^ task is: TASK: Get NM profile info 7557 1726882113.41028: ^ state is: HOST STATE: block=2, task=31, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7557 1726882113.41031: getting variables 7557 1726882113.41033: in VariableManager get_vars() 7557 1726882113.41081: Calling all_inventory to load vars for managed_node3 7557 1726882113.41084: Calling groups_inventory to load vars for managed_node3 7557 1726882113.41086: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882113.41100: Calling all_plugins_play to load vars for managed_node3 7557 1726882113.41103: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882113.41107: Calling groups_plugins_play to load vars for managed_node3 7557 1726882113.42736: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882113.44828: done with get_vars() 7557 1726882113.44965: done getting variables 7557 1726882113.45110: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Get NM profile info] ***************************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25 Friday 20 September 2024 21:28:33 -0400 (0:00:00.062) 0:00:39.304 ****** 7557 1726882113.45144: entering _queue_task() for managed_node3/shell 7557 1726882113.45971: worker is 1 (out of 1 available) 7557 1726882113.45984: exiting _queue_task() for managed_node3/shell 7557 1726882113.46001: done queuing things up, now waiting for results queue to drain 7557 1726882113.46002: waiting for pending results... 7557 1726882113.46461: running TaskExecutor() for managed_node3/TASK: Get NM profile info 7557 1726882113.46696: in run() - task 12673a56-9f93-ed48-b3a5-000000001962 7557 1726882113.46820: variable 'ansible_search_path' from source: unknown 7557 1726882113.46824: variable 'ansible_search_path' from source: unknown 7557 1726882113.46860: calling self._execute() 7557 1726882113.47088: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882113.47096: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882113.47114: variable 'omit' from source: magic vars 7557 1726882113.47676: variable 'ansible_distribution_major_version' from source: facts 7557 1726882113.47680: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882113.47682: variable 'omit' from source: magic vars 7557 1726882113.47691: variable 'omit' from source: magic vars 7557 1726882113.47808: variable 'profile' from source: include params 7557 1726882113.47811: variable 'interface' from source: play vars 7557 1726882113.47878: variable 'interface' from source: play vars 7557 1726882113.47908: variable 'omit' from source: magic vars 7557 1726882113.47950: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7557 1726882113.47990: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7557 1726882113.48016: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7557 1726882113.48034: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7557 1726882113.48048: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7557 1726882113.48086: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7557 1726882113.48089: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882113.48092: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882113.48215: Set connection var ansible_module_compression to ZIP_DEFLATED 7557 1726882113.48224: Set connection var ansible_shell_executable to /bin/sh 7557 1726882113.48227: Set connection var ansible_shell_type to sh 7557 1726882113.48233: Set connection var ansible_pipelining to False 7557 1726882113.48236: Set connection var ansible_connection to ssh 7557 1726882113.48399: Set connection var ansible_timeout to 10 7557 1726882113.48403: variable 'ansible_shell_executable' from source: unknown 7557 1726882113.48406: variable 'ansible_connection' from source: unknown 7557 1726882113.48408: variable 'ansible_module_compression' from source: unknown 7557 1726882113.48411: variable 'ansible_shell_type' from source: unknown 7557 1726882113.48413: variable 'ansible_shell_executable' from source: unknown 7557 1726882113.48415: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882113.48418: variable 'ansible_pipelining' from source: unknown 7557 1726882113.48421: variable 'ansible_timeout' from source: unknown 7557 1726882113.48423: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882113.48434: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 7557 1726882113.48701: variable 'omit' from source: magic vars 7557 1726882113.48705: starting attempt loop 7557 1726882113.48708: running the handler 7557 1726882113.48711: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 7557 1726882113.48713: _low_level_execute_command(): starting 7557 1726882113.48715: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7557 1726882113.49296: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882113.49359: stderr chunk (state=3): >>>debug1: auto-mux: Trying 
existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882113.49382: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882113.49398: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882113.49475: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882113.51079: stdout chunk (state=3): >>>/root <<< 7557 1726882113.51401: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882113.51405: stdout chunk (state=3): >>><<< 7557 1726882113.51407: stderr chunk (state=3): >>><<< 7557 1726882113.51411: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7557 1726882113.51413: _low_level_execute_command(): starting 7557 1726882113.51416: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882113.5133798-9104-214628528310651 `" && echo ansible-tmp-1726882113.5133798-9104-214628528310651="` echo /root/.ansible/tmp/ansible-tmp-1726882113.5133798-9104-214628528310651 `" ) && sleep 0' 7557 1726882113.52019: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7557 1726882113.52029: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7557 1726882113.52040: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882113.52063: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7557 1726882113.52076: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 <<< 7557 1726882113.52083: stderr chunk (state=3): >>>debug2: match not found <<< 7557 1726882113.52095: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882113.52121: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7557 1726882113.52130: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.10.229 is address <<< 7557 1726882113.52138: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7557 1726882113.52147: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7557 1726882113.52165: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882113.52177: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7557 1726882113.52185: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 <<< 7557 1726882113.52207: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882113.52278: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882113.52305: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882113.52372: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882113.54199: stdout chunk (state=3): >>>ansible-tmp-1726882113.5133798-9104-214628528310651=/root/.ansible/tmp/ansible-tmp-1726882113.5133798-9104-214628528310651 <<< 7557 1726882113.54355: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882113.54358: stdout chunk (state=3): >>><<< 7557 1726882113.54361: stderr chunk (state=3): >>><<< 7557 1726882113.54600: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882113.5133798-9104-214628528310651=/root/.ansible/tmp/ansible-tmp-1726882113.5133798-9104-214628528310651 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7557 1726882113.54603: variable 'ansible_module_compression' from source: unknown 7557 1726882113.54606: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-7557ap94rh2e/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 7557 1726882113.54608: variable 'ansible_facts' from source: unknown 7557 1726882113.54610: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882113.5133798-9104-214628528310651/AnsiballZ_command.py 7557 1726882113.54746: Sending initial data 7557 1726882113.54847: Sent initial data (154 bytes) 7557 1726882113.55383: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7557 1726882113.55387: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7557 1726882113.55407: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882113.55504: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882113.55516: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882113.55598: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882113.57104: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 7557 1726882113.57151: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 7557 1726882113.57199: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-7557ap94rh2e/tmp2ysz3m1x /root/.ansible/tmp/ansible-tmp-1726882113.5133798-9104-214628528310651/AnsiballZ_command.py <<< 7557 1726882113.57203: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882113.5133798-9104-214628528310651/AnsiballZ_command.py" <<< 7557 1726882113.57244: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-7557ap94rh2e/tmp2ysz3m1x" to remote "/root/.ansible/tmp/ansible-tmp-1726882113.5133798-9104-214628528310651/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882113.5133798-9104-214628528310651/AnsiballZ_command.py" <<< 7557 1726882113.58052: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882113.58056: stdout chunk (state=3): >>><<< 7557 1726882113.58128: stderr chunk (state=3): >>><<< 7557 1726882113.58138: done transferring module to remote 7557 1726882113.58160: _low_level_execute_command(): starting 7557 1726882113.58172: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882113.5133798-9104-214628528310651/ /root/.ansible/tmp/ansible-tmp-1726882113.5133798-9104-214628528310651/AnsiballZ_command.py && sleep 0' 7557 1726882113.58909: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882113.58961: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882113.58984: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882113.59037: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882113.59071: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882113.60805: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882113.60808: stdout chunk (state=3): >>><<< 7557 1726882113.60810: stderr chunk (state=3): >>><<< 7557 1726882113.60825: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7557 1726882113.60837: _low_level_execute_command(): starting 7557 1726882113.60910: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882113.5133798-9104-214628528310651/AnsiballZ_command.py && sleep 0' 7557 1726882113.61429: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7557 1726882113.61441: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7557 1726882113.61456: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882113.61475: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7557 1726882113.61512: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882113.61529: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7557 1726882113.61609: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882113.61624: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882113.61648: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882113.61725: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882113.78824: stdout chunk (state=3): >>> {"changed": true, "stdout": "veth0 /etc/NetworkManager/system-connections/veth0.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep veth0 | grep /etc", "start": "2024-09-20 21:28:33.768080", "end": "2024-09-20 21:28:33.785587", "delta": "0:00:00.017507", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep veth0 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 7557 1726882113.80322: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. 
<<< 7557 1726882113.80437: stderr chunk (state=3): >>><<< 7557 1726882113.80441: stdout chunk (state=3): >>><<< 7557 1726882113.80443: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "veth0 /etc/NetworkManager/system-connections/veth0.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep veth0 | grep /etc", "start": "2024-09-20 21:28:33.768080", "end": "2024-09-20 21:28:33.785587", "delta": "0:00:00.017507", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep veth0 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. 
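The exchange above is Ansible's standard module-delivery flow over the multiplexed SSH connection: the command action plugin uploads AnsiballZ_command.py with sftp, marks it executable, runs it with the remote /usr/bin/python3.12, and the JSON blob in stdout is the module result (the temporary directory is removed in the records that follow). The exact contents of get_profile_stat.yml are not shown in this log, but from the module arguments recorded in that result, and with the register name inferred from the nm_profile_exists.rc == 0 conditional evaluated a few records below, the task that produced it presumably looks roughly like this sketch:

- name: Get NM profile info
  ansible.builtin.shell: nmcli -f NAME,FILENAME connection show | grep {{ profile }} | grep /etc   # '{{ profile }}' is veth0 on this run
  register: nm_profile_exists      # name inferred from the conditional logged below
  ignore_errors: true              # assumption: grep exits non-zero when the profile is absent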
7557 1726882113.80447: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep veth0 | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882113.5133798-9104-214628528310651/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7557 1726882113.80461: _low_level_execute_command(): starting 7557 1726882113.80470: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882113.5133798-9104-214628528310651/ > /dev/null 2>&1 && sleep 0' 7557 1726882113.81208: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882113.81244: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882113.81260: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882113.81361: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882113.83182: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882113.83199: stdout chunk (state=3): >>><<< 7557 1726882113.83209: stderr chunk (state=3): >>><<< 7557 1726882113.83229: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 
debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7557 1726882113.83399: handler run complete 7557 1726882113.83402: Evaluated conditional (False): False 7557 1726882113.83404: attempt loop complete, returning result 7557 1726882113.83406: _execute() done 7557 1726882113.83408: dumping result to json 7557 1726882113.83410: done dumping result, returning 7557 1726882113.83412: done running TaskExecutor() for managed_node3/TASK: Get NM profile info [12673a56-9f93-ed48-b3a5-000000001962] 7557 1726882113.83414: sending task result for task 12673a56-9f93-ed48-b3a5-000000001962 7557 1726882113.83486: done sending task result for task 12673a56-9f93-ed48-b3a5-000000001962 7557 1726882113.83489: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "cmd": "nmcli -f NAME,FILENAME connection show |grep veth0 | grep /etc", "delta": "0:00:00.017507", "end": "2024-09-20 21:28:33.785587", "rc": 0, "start": "2024-09-20 21:28:33.768080" } STDOUT: veth0 /etc/NetworkManager/system-connections/veth0.nmconnection 7557 1726882113.83567: no more pending results, returning what we have 7557 1726882113.83572: results queue empty 7557 1726882113.83574: checking for any_errors_fatal 7557 1726882113.83581: done checking for any_errors_fatal 7557 1726882113.83582: checking for max_fail_percentage 7557 1726882113.83584: done checking for max_fail_percentage 7557 1726882113.83585: checking to see if all hosts have failed and the running result is not ok 7557 1726882113.83586: done checking to see if all hosts have failed 7557 1726882113.83587: getting the remaining hosts for this loop 7557 1726882113.83588: done getting the remaining hosts for this loop 7557 1726882113.83592: getting the next task for host managed_node3 7557 1726882113.83608: done getting next task for host managed_node3 7557 1726882113.83612: ^ task is: TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 7557 1726882113.83617: ^ state is: HOST STATE: block=2, task=31, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7557 1726882113.83622: getting variables 7557 1726882113.83624: in VariableManager get_vars() 7557 1726882113.83678: Calling all_inventory to load vars for managed_node3 7557 1726882113.83681: Calling groups_inventory to load vars for managed_node3 7557 1726882113.83684: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882113.83823: Calling all_plugins_play to load vars for managed_node3 7557 1726882113.83828: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882113.83832: Calling groups_plugins_play to load vars for managed_node3 7557 1726882113.85416: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882113.87050: done with get_vars() 7557 1726882113.87078: done getting variables 7557 1726882113.87138: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli output] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35 Friday 20 September 2024 21:28:33 -0400 (0:00:00.420) 0:00:39.724 ****** 7557 1726882113.87175: entering _queue_task() for managed_node3/set_fact 7557 1726882113.87513: worker is 1 (out of 1 available) 7557 1726882113.87527: exiting _queue_task() for managed_node3/set_fact 7557 1726882113.87541: done queuing things up, now waiting for results queue to drain 7557 1726882113.87543: waiting for pending results... 7557 1726882113.87864: running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 7557 1726882113.87885: in run() - task 12673a56-9f93-ed48-b3a5-000000001963 7557 1726882113.87897: variable 'ansible_search_path' from source: unknown 7557 1726882113.87903: variable 'ansible_search_path' from source: unknown 7557 1726882113.87956: calling self._execute() 7557 1726882113.88064: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882113.88068: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882113.88072: variable 'omit' from source: magic vars 7557 1726882113.88432: variable 'ansible_distribution_major_version' from source: facts 7557 1726882113.88459: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882113.88614: variable 'nm_profile_exists' from source: set_fact 7557 1726882113.88618: Evaluated conditional (nm_profile_exists.rc == 0): True 7557 1726882113.88620: variable 'omit' from source: magic vars 7557 1726882113.88635: variable 'omit' from source: magic vars 7557 1726882113.88666: variable 'omit' from source: magic vars 7557 1726882113.88731: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7557 1726882113.88744: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7557 1726882113.88785: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7557 1726882113.88789: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7557 1726882113.88792: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7557 1726882113.88825: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7557 1726882113.88829: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882113.88831: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882113.88955: Set connection var ansible_module_compression to ZIP_DEFLATED 7557 1726882113.88959: Set connection var ansible_shell_executable to /bin/sh 7557 1726882113.88962: Set connection var ansible_shell_type to sh 7557 1726882113.88964: Set connection var ansible_pipelining to False 7557 1726882113.88966: Set connection var ansible_connection to ssh 7557 1726882113.88968: Set connection var ansible_timeout to 10 7557 1726882113.89000: variable 'ansible_shell_executable' from source: unknown 7557 1726882113.89003: variable 'ansible_connection' from source: unknown 7557 1726882113.89005: variable 'ansible_module_compression' from source: unknown 7557 1726882113.89007: variable 'ansible_shell_type' from source: unknown 7557 1726882113.89010: variable 'ansible_shell_executable' from source: unknown 7557 1726882113.89013: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882113.89015: variable 'ansible_pipelining' from source: unknown 7557 1726882113.89017: variable 'ansible_timeout' from source: unknown 7557 1726882113.89019: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882113.89212: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 7557 1726882113.89216: variable 'omit' from source: magic vars 7557 1726882113.89218: starting attempt loop 7557 1726882113.89221: running the handler 7557 1726882113.89223: handler run complete 7557 1726882113.89225: attempt loop complete, returning result 7557 1726882113.89227: _execute() done 7557 1726882113.89229: dumping result to json 7557 1726882113.89231: done dumping result, returning 7557 1726882113.89234: done running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [12673a56-9f93-ed48-b3a5-000000001963] 7557 1726882113.89236: sending task result for task 12673a56-9f93-ed48-b3a5-000000001963 7557 1726882113.89296: done sending task result for task 12673a56-9f93-ed48-b3a5-000000001963 7557 1726882113.89300: WORKER PROCESS EXITING ok: [managed_node3] => { "ansible_facts": { "lsr_net_profile_ansible_managed": true, "lsr_net_profile_exists": true, "lsr_net_profile_fingerprint": true }, "changed": false } 7557 1726882113.89363: no more pending results, returning what we have 7557 1726882113.89366: results queue empty 7557 1726882113.89367: checking for any_errors_fatal 7557 1726882113.89375: done checking for any_errors_fatal 7557 1726882113.89375: checking for max_fail_percentage 7557 1726882113.89377: done checking for max_fail_percentage 7557 1726882113.89378: checking to see if all hosts have failed and the running result is not ok 7557 1726882113.89379: done checking to see if all hosts have failed 7557 1726882113.89379: getting the remaining hosts for this loop 7557 1726882113.89381: done getting the remaining hosts for this loop 7557 1726882113.89384: getting 
the next task for host managed_node3 7557 1726882113.89391: done getting next task for host managed_node3 7557 1726882113.89395: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }} 7557 1726882113.89400: ^ state is: HOST STATE: block=2, task=31, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7557 1726882113.89404: getting variables 7557 1726882113.89405: in VariableManager get_vars() 7557 1726882113.89448: Calling all_inventory to load vars for managed_node3 7557 1726882113.89450: Calling groups_inventory to load vars for managed_node3 7557 1726882113.89452: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882113.89460: Calling all_plugins_play to load vars for managed_node3 7557 1726882113.89463: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882113.89465: Calling groups_plugins_play to load vars for managed_node3 7557 1726882113.90945: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882113.92455: done with get_vars() 7557 1726882113.92482: done getting variables 7557 1726882113.92543: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 7557 1726882113.92658: variable 'profile' from source: include params 7557 1726882113.92663: variable 'interface' from source: play vars 7557 1726882113.92725: variable 'interface' from source: play vars TASK [Get the ansible_managed comment in ifcfg-veth0] ************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49 Friday 20 September 2024 21:28:33 -0400 (0:00:00.055) 0:00:39.780 ****** 7557 1726882113.92764: entering _queue_task() for managed_node3/command 7557 1726882113.93103: worker is 1 (out of 1 available) 7557 1726882113.93118: exiting _queue_task() for managed_node3/command 7557 1726882113.93132: done queuing things up, now waiting for results queue to drain 7557 1726882113.93133: waiting for pending results... 
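Because the nmcli lookup returned rc=0, the nm_profile_exists.rc == 0 conditional held and the set_fact handler above recorded three flags (lsr_net_profile_exists, lsr_net_profile_ansible_managed, lsr_net_profile_fingerprint) that the assert tasks later in this run consume. A plausible sketch of the task at get_profile_stat.yml:35, reconstructed only from the logged conditionals and the ansible_facts in the result:

- name: Set NM profile exist flag and ansible_managed flag true based on the nmcli output
  ansible.builtin.set_fact:
    lsr_net_profile_exists: true
    lsr_net_profile_ansible_managed: true
    lsr_net_profile_fingerprint: true
  when: nm_profile_exists.rc == 0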
7557 1726882113.93527: running TaskExecutor() for managed_node3/TASK: Get the ansible_managed comment in ifcfg-veth0 7557 1726882113.93628: in run() - task 12673a56-9f93-ed48-b3a5-000000001965 7557 1726882113.93633: variable 'ansible_search_path' from source: unknown 7557 1726882113.93636: variable 'ansible_search_path' from source: unknown 7557 1726882113.93639: calling self._execute() 7557 1726882113.93712: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882113.93723: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882113.93733: variable 'omit' from source: magic vars 7557 1726882113.94096: variable 'ansible_distribution_major_version' from source: facts 7557 1726882113.94108: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882113.94198: variable 'profile_stat' from source: set_fact 7557 1726882113.94212: Evaluated conditional (profile_stat.stat.exists): False 7557 1726882113.94216: when evaluation is False, skipping this task 7557 1726882113.94219: _execute() done 7557 1726882113.94221: dumping result to json 7557 1726882113.94224: done dumping result, returning 7557 1726882113.94229: done running TaskExecutor() for managed_node3/TASK: Get the ansible_managed comment in ifcfg-veth0 [12673a56-9f93-ed48-b3a5-000000001965] 7557 1726882113.94234: sending task result for task 12673a56-9f93-ed48-b3a5-000000001965 7557 1726882113.94319: done sending task result for task 12673a56-9f93-ed48-b3a5-000000001965 7557 1726882113.94322: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 7557 1726882113.94369: no more pending results, returning what we have 7557 1726882113.94373: results queue empty 7557 1726882113.94374: checking for any_errors_fatal 7557 1726882113.94380: done checking for any_errors_fatal 7557 1726882113.94381: checking for max_fail_percentage 7557 1726882113.94383: done checking for max_fail_percentage 7557 1726882113.94383: checking to see if all hosts have failed and the running result is not ok 7557 1726882113.94384: done checking to see if all hosts have failed 7557 1726882113.94385: getting the remaining hosts for this loop 7557 1726882113.94386: done getting the remaining hosts for this loop 7557 1726882113.94390: getting the next task for host managed_node3 7557 1726882113.94399: done getting next task for host managed_node3 7557 1726882113.94401: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }} 7557 1726882113.94406: ^ state is: HOST STATE: block=2, task=31, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7557 1726882113.94411: getting variables 7557 1726882113.94412: in VariableManager get_vars() 7557 1726882113.94462: Calling all_inventory to load vars for managed_node3 7557 1726882113.94464: Calling groups_inventory to load vars for managed_node3 7557 1726882113.94467: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882113.94477: Calling all_plugins_play to load vars for managed_node3 7557 1726882113.94479: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882113.94481: Calling groups_plugins_play to load vars for managed_node3 7557 1726882113.95279: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882113.96517: done with get_vars() 7557 1726882113.96551: done getting variables 7557 1726882113.96615: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 7557 1726882113.96736: variable 'profile' from source: include params 7557 1726882113.96740: variable 'interface' from source: play vars 7557 1726882113.96802: variable 'interface' from source: play vars TASK [Verify the ansible_managed comment in ifcfg-veth0] *********************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56 Friday 20 September 2024 21:28:33 -0400 (0:00:00.040) 0:00:39.821 ****** 7557 1726882113.96836: entering _queue_task() for managed_node3/set_fact 7557 1726882113.97091: worker is 1 (out of 1 available) 7557 1726882113.97107: exiting _queue_task() for managed_node3/set_fact 7557 1726882113.97121: done queuing things up, now waiting for results queue to drain 7557 1726882113.97122: waiting for pending results... 
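The four ifcfg-oriented tasks in this stretch are all skipped for the same reason: profile_stat.stat.exists evaluates to False. On this run the veth0 profile is a NetworkManager keyfile under /etc/NetworkManager/system-connections (see the nmcli output above), so there is no initscripts ifcfg-veth0 file to inspect. The log does not show how profile_stat is populated (it is reported as coming from set_fact earlier in get_profile_stat.yml), so the following is only a hypothetical sketch of the conditional pattern; the file path, grep pattern, and register name are assumptions:

- name: Get the ansible_managed comment in ifcfg-{{ profile }}
  ansible.builtin.command: grep 'ansible managed' /etc/sysconfig/network-scripts/ifcfg-{{ profile }}   # assumed path and pattern
  register: ifcfg_ansible_managed   # hypothetical register name
  when: profile_stat.stat.exists    # False on this run, so the task is skipped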
7557 1726882113.97301: running TaskExecutor() for managed_node3/TASK: Verify the ansible_managed comment in ifcfg-veth0 7557 1726882113.97384: in run() - task 12673a56-9f93-ed48-b3a5-000000001966 7557 1726882113.97396: variable 'ansible_search_path' from source: unknown 7557 1726882113.97399: variable 'ansible_search_path' from source: unknown 7557 1726882113.97430: calling self._execute() 7557 1726882113.97513: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882113.97517: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882113.97526: variable 'omit' from source: magic vars 7557 1726882113.97790: variable 'ansible_distribution_major_version' from source: facts 7557 1726882113.97803: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882113.97889: variable 'profile_stat' from source: set_fact 7557 1726882113.97905: Evaluated conditional (profile_stat.stat.exists): False 7557 1726882113.97908: when evaluation is False, skipping this task 7557 1726882113.97911: _execute() done 7557 1726882113.97914: dumping result to json 7557 1726882113.97916: done dumping result, returning 7557 1726882113.97920: done running TaskExecutor() for managed_node3/TASK: Verify the ansible_managed comment in ifcfg-veth0 [12673a56-9f93-ed48-b3a5-000000001966] 7557 1726882113.97926: sending task result for task 12673a56-9f93-ed48-b3a5-000000001966 7557 1726882113.98018: done sending task result for task 12673a56-9f93-ed48-b3a5-000000001966 7557 1726882113.98022: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 7557 1726882113.98063: no more pending results, returning what we have 7557 1726882113.98067: results queue empty 7557 1726882113.98068: checking for any_errors_fatal 7557 1726882113.98077: done checking for any_errors_fatal 7557 1726882113.98078: checking for max_fail_percentage 7557 1726882113.98079: done checking for max_fail_percentage 7557 1726882113.98080: checking to see if all hosts have failed and the running result is not ok 7557 1726882113.98081: done checking to see if all hosts have failed 7557 1726882113.98082: getting the remaining hosts for this loop 7557 1726882113.98083: done getting the remaining hosts for this loop 7557 1726882113.98087: getting the next task for host managed_node3 7557 1726882113.98096: done getting next task for host managed_node3 7557 1726882113.98099: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }} 7557 1726882113.98109: ^ state is: HOST STATE: block=2, task=31, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7557 1726882113.98116: getting variables 7557 1726882113.98117: in VariableManager get_vars() 7557 1726882113.98164: Calling all_inventory to load vars for managed_node3 7557 1726882113.98166: Calling groups_inventory to load vars for managed_node3 7557 1726882113.98168: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882113.98180: Calling all_plugins_play to load vars for managed_node3 7557 1726882113.98182: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882113.98185: Calling groups_plugins_play to load vars for managed_node3 7557 1726882113.99359: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882114.00684: done with get_vars() 7557 1726882114.00707: done getting variables 7557 1726882114.00749: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 7557 1726882114.00834: variable 'profile' from source: include params 7557 1726882114.00837: variable 'interface' from source: play vars 7557 1726882114.00877: variable 'interface' from source: play vars TASK [Get the fingerprint comment in ifcfg-veth0] ****************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62 Friday 20 September 2024 21:28:34 -0400 (0:00:00.040) 0:00:39.861 ****** 7557 1726882114.00903: entering _queue_task() for managed_node3/command 7557 1726882114.01154: worker is 1 (out of 1 available) 7557 1726882114.01166: exiting _queue_task() for managed_node3/command 7557 1726882114.01180: done queuing things up, now waiting for results queue to drain 7557 1726882114.01181: waiting for pending results... 
7557 1726882114.01356: running TaskExecutor() for managed_node3/TASK: Get the fingerprint comment in ifcfg-veth0 7557 1726882114.01441: in run() - task 12673a56-9f93-ed48-b3a5-000000001967 7557 1726882114.01452: variable 'ansible_search_path' from source: unknown 7557 1726882114.01455: variable 'ansible_search_path' from source: unknown 7557 1726882114.01484: calling self._execute() 7557 1726882114.01564: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882114.01568: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882114.01577: variable 'omit' from source: magic vars 7557 1726882114.01845: variable 'ansible_distribution_major_version' from source: facts 7557 1726882114.01859: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882114.01938: variable 'profile_stat' from source: set_fact 7557 1726882114.01953: Evaluated conditional (profile_stat.stat.exists): False 7557 1726882114.01956: when evaluation is False, skipping this task 7557 1726882114.01958: _execute() done 7557 1726882114.01963: dumping result to json 7557 1726882114.01965: done dumping result, returning 7557 1726882114.01971: done running TaskExecutor() for managed_node3/TASK: Get the fingerprint comment in ifcfg-veth0 [12673a56-9f93-ed48-b3a5-000000001967] 7557 1726882114.01985: sending task result for task 12673a56-9f93-ed48-b3a5-000000001967 7557 1726882114.02077: done sending task result for task 12673a56-9f93-ed48-b3a5-000000001967 7557 1726882114.02081: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 7557 1726882114.02131: no more pending results, returning what we have 7557 1726882114.02135: results queue empty 7557 1726882114.02136: checking for any_errors_fatal 7557 1726882114.02143: done checking for any_errors_fatal 7557 1726882114.02144: checking for max_fail_percentage 7557 1726882114.02145: done checking for max_fail_percentage 7557 1726882114.02147: checking to see if all hosts have failed and the running result is not ok 7557 1726882114.02147: done checking to see if all hosts have failed 7557 1726882114.02148: getting the remaining hosts for this loop 7557 1726882114.02149: done getting the remaining hosts for this loop 7557 1726882114.02152: getting the next task for host managed_node3 7557 1726882114.02159: done getting next task for host managed_node3 7557 1726882114.02161: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }} 7557 1726882114.02166: ^ state is: HOST STATE: block=2, task=31, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7557 1726882114.02170: getting variables 7557 1726882114.02171: in VariableManager get_vars() 7557 1726882114.02223: Calling all_inventory to load vars for managed_node3 7557 1726882114.02225: Calling groups_inventory to load vars for managed_node3 7557 1726882114.02228: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882114.02239: Calling all_plugins_play to load vars for managed_node3 7557 1726882114.02242: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882114.02244: Calling groups_plugins_play to load vars for managed_node3 7557 1726882114.03602: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882114.05002: done with get_vars() 7557 1726882114.05022: done getting variables 7557 1726882114.05083: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 7557 1726882114.05169: variable 'profile' from source: include params 7557 1726882114.05174: variable 'interface' from source: play vars 7557 1726882114.05217: variable 'interface' from source: play vars TASK [Verify the fingerprint comment in ifcfg-veth0] *************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69 Friday 20 September 2024 21:28:34 -0400 (0:00:00.043) 0:00:39.905 ****** 7557 1726882114.05242: entering _queue_task() for managed_node3/set_fact 7557 1726882114.05484: worker is 1 (out of 1 available) 7557 1726882114.05499: exiting _queue_task() for managed_node3/set_fact 7557 1726882114.05513: done queuing things up, now waiting for results queue to drain 7557 1726882114.05514: waiting for pending results... 
7557 1726882114.05689: running TaskExecutor() for managed_node3/TASK: Verify the fingerprint comment in ifcfg-veth0 7557 1726882114.05775: in run() - task 12673a56-9f93-ed48-b3a5-000000001968 7557 1726882114.05786: variable 'ansible_search_path' from source: unknown 7557 1726882114.05789: variable 'ansible_search_path' from source: unknown 7557 1726882114.05821: calling self._execute() 7557 1726882114.05902: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882114.05907: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882114.05917: variable 'omit' from source: magic vars 7557 1726882114.06182: variable 'ansible_distribution_major_version' from source: facts 7557 1726882114.06194: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882114.06273: variable 'profile_stat' from source: set_fact 7557 1726882114.06289: Evaluated conditional (profile_stat.stat.exists): False 7557 1726882114.06292: when evaluation is False, skipping this task 7557 1726882114.06296: _execute() done 7557 1726882114.06298: dumping result to json 7557 1726882114.06301: done dumping result, returning 7557 1726882114.06309: done running TaskExecutor() for managed_node3/TASK: Verify the fingerprint comment in ifcfg-veth0 [12673a56-9f93-ed48-b3a5-000000001968] 7557 1726882114.06314: sending task result for task 12673a56-9f93-ed48-b3a5-000000001968 7557 1726882114.06400: done sending task result for task 12673a56-9f93-ed48-b3a5-000000001968 skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 7557 1726882114.06447: no more pending results, returning what we have 7557 1726882114.06451: results queue empty 7557 1726882114.06452: checking for any_errors_fatal 7557 1726882114.06458: done checking for any_errors_fatal 7557 1726882114.06459: checking for max_fail_percentage 7557 1726882114.06460: done checking for max_fail_percentage 7557 1726882114.06461: checking to see if all hosts have failed and the running result is not ok 7557 1726882114.06462: done checking to see if all hosts have failed 7557 1726882114.06463: getting the remaining hosts for this loop 7557 1726882114.06464: done getting the remaining hosts for this loop 7557 1726882114.06467: getting the next task for host managed_node3 7557 1726882114.06475: done getting next task for host managed_node3 7557 1726882114.06477: ^ task is: TASK: Assert that the profile is present - '{{ profile }}' 7557 1726882114.06482: ^ state is: HOST STATE: block=2, task=31, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7557 1726882114.06487: getting variables 7557 1726882114.06488: in VariableManager get_vars() 7557 1726882114.06541: Calling all_inventory to load vars for managed_node3 7557 1726882114.06543: Calling groups_inventory to load vars for managed_node3 7557 1726882114.06545: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882114.06556: WORKER PROCESS EXITING 7557 1726882114.06566: Calling all_plugins_play to load vars for managed_node3 7557 1726882114.06569: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882114.06571: Calling groups_plugins_play to load vars for managed_node3 7557 1726882114.07911: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882114.08769: done with get_vars() 7557 1726882114.08784: done getting variables 7557 1726882114.08831: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 7557 1726882114.08910: variable 'profile' from source: include params 7557 1726882114.08913: variable 'interface' from source: play vars 7557 1726882114.08954: variable 'interface' from source: play vars TASK [Assert that the profile is present - 'veth0'] **************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:5 Friday 20 September 2024 21:28:34 -0400 (0:00:00.037) 0:00:39.942 ****** 7557 1726882114.08976: entering _queue_task() for managed_node3/assert 7557 1726882114.09205: worker is 1 (out of 1 available) 7557 1726882114.09218: exiting _queue_task() for managed_node3/assert 7557 1726882114.09232: done queuing things up, now waiting for results queue to drain 7557 1726882114.09233: waiting for pending results... 
7557 1726882114.09402: running TaskExecutor() for managed_node3/TASK: Assert that the profile is present - 'veth0' 7557 1726882114.09464: in run() - task 12673a56-9f93-ed48-b3a5-0000000016d3 7557 1726882114.09477: variable 'ansible_search_path' from source: unknown 7557 1726882114.09482: variable 'ansible_search_path' from source: unknown 7557 1726882114.09511: calling self._execute() 7557 1726882114.09589: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882114.09597: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882114.09606: variable 'omit' from source: magic vars 7557 1726882114.09865: variable 'ansible_distribution_major_version' from source: facts 7557 1726882114.09875: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882114.09880: variable 'omit' from source: magic vars 7557 1726882114.09915: variable 'omit' from source: magic vars 7557 1726882114.09979: variable 'profile' from source: include params 7557 1726882114.09983: variable 'interface' from source: play vars 7557 1726882114.10034: variable 'interface' from source: play vars 7557 1726882114.10049: variable 'omit' from source: magic vars 7557 1726882114.10081: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7557 1726882114.10110: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7557 1726882114.10127: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7557 1726882114.10140: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7557 1726882114.10150: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7557 1726882114.10173: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7557 1726882114.10176: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882114.10179: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882114.10254: Set connection var ansible_module_compression to ZIP_DEFLATED 7557 1726882114.10259: Set connection var ansible_shell_executable to /bin/sh 7557 1726882114.10262: Set connection var ansible_shell_type to sh 7557 1726882114.10268: Set connection var ansible_pipelining to False 7557 1726882114.10270: Set connection var ansible_connection to ssh 7557 1726882114.10274: Set connection var ansible_timeout to 10 7557 1726882114.10290: variable 'ansible_shell_executable' from source: unknown 7557 1726882114.10297: variable 'ansible_connection' from source: unknown 7557 1726882114.10299: variable 'ansible_module_compression' from source: unknown 7557 1726882114.10302: variable 'ansible_shell_type' from source: unknown 7557 1726882114.10304: variable 'ansible_shell_executable' from source: unknown 7557 1726882114.10306: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882114.10308: variable 'ansible_pipelining' from source: unknown 7557 1726882114.10311: variable 'ansible_timeout' from source: unknown 7557 1726882114.10313: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882114.10413: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 7557 1726882114.10422: variable 'omit' from source: magic vars 7557 1726882114.10429: starting attempt loop 7557 1726882114.10432: running the handler 7557 1726882114.10509: variable 'lsr_net_profile_exists' from source: set_fact 7557 1726882114.10513: Evaluated conditional (lsr_net_profile_exists): True 7557 1726882114.10518: handler run complete 7557 1726882114.10529: attempt loop complete, returning result 7557 1726882114.10532: _execute() done 7557 1726882114.10535: dumping result to json 7557 1726882114.10537: done dumping result, returning 7557 1726882114.10546: done running TaskExecutor() for managed_node3/TASK: Assert that the profile is present - 'veth0' [12673a56-9f93-ed48-b3a5-0000000016d3] 7557 1726882114.10550: sending task result for task 12673a56-9f93-ed48-b3a5-0000000016d3 7557 1726882114.10627: done sending task result for task 12673a56-9f93-ed48-b3a5-0000000016d3 7557 1726882114.10630: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false } MSG: All assertions passed 7557 1726882114.10697: no more pending results, returning what we have 7557 1726882114.10701: results queue empty 7557 1726882114.10702: checking for any_errors_fatal 7557 1726882114.10708: done checking for any_errors_fatal 7557 1726882114.10709: checking for max_fail_percentage 7557 1726882114.10710: done checking for max_fail_percentage 7557 1726882114.10711: checking to see if all hosts have failed and the running result is not ok 7557 1726882114.10712: done checking to see if all hosts have failed 7557 1726882114.10713: getting the remaining hosts for this loop 7557 1726882114.10714: done getting the remaining hosts for this loop 7557 1726882114.10717: getting the next task for host managed_node3 7557 1726882114.10722: done getting next task for host managed_node3 7557 1726882114.10724: ^ task is: TASK: Assert that the ansible managed comment is present in '{{ profile }}' 7557 1726882114.10727: ^ state is: HOST STATE: block=2, task=31, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7557 1726882114.10730: getting variables 7557 1726882114.10732: in VariableManager get_vars() 7557 1726882114.10774: Calling all_inventory to load vars for managed_node3 7557 1726882114.10777: Calling groups_inventory to load vars for managed_node3 7557 1726882114.10779: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882114.10788: Calling all_plugins_play to load vars for managed_node3 7557 1726882114.10790: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882114.10796: Calling groups_plugins_play to load vars for managed_node3 7557 1726882114.11561: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882114.16333: done with get_vars() 7557 1726882114.16350: done getting variables 7557 1726882114.16384: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 7557 1726882114.16457: variable 'profile' from source: include params 7557 1726882114.16459: variable 'interface' from source: play vars 7557 1726882114.16501: variable 'interface' from source: play vars TASK [Assert that the ansible managed comment is present in 'veth0'] *********** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:10 Friday 20 September 2024 21:28:34 -0400 (0:00:00.075) 0:00:40.018 ****** 7557 1726882114.16524: entering _queue_task() for managed_node3/assert 7557 1726882114.16776: worker is 1 (out of 1 available) 7557 1726882114.16791: exiting _queue_task() for managed_node3/assert 7557 1726882114.16807: done queuing things up, now waiting for results queue to drain 7557 1726882114.16810: waiting for pending results... 
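Throughout these records, profile is reported as coming from include params and interface from play vars, which is the usual way these shared task files are driven from the test playbook. A minimal, hypothetical wrapper illustrating that wiring (the actual playbook under the /tmp/collections-spT test tree is not shown in this log, and the task names here are illustrative):

- hosts: managed_node3
  vars:
    interface: veth0
  tasks:
    - name: Gather profile facts for the test interface
      ansible.builtin.include_tasks: tasks/get_profile_stat.yml
      vars:
        profile: "{{ interface }}"

    - name: Assert the profile is present and ansible-managed
      ansible.builtin.include_tasks: tasks/assert_profile_present.yml
      vars:
        profile: "{{ interface }}"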
7557 1726882114.16978: running TaskExecutor() for managed_node3/TASK: Assert that the ansible managed comment is present in 'veth0' 7557 1726882114.17057: in run() - task 12673a56-9f93-ed48-b3a5-0000000016d4 7557 1726882114.17068: variable 'ansible_search_path' from source: unknown 7557 1726882114.17073: variable 'ansible_search_path' from source: unknown 7557 1726882114.17104: calling self._execute() 7557 1726882114.17186: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882114.17191: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882114.17203: variable 'omit' from source: magic vars 7557 1726882114.17473: variable 'ansible_distribution_major_version' from source: facts 7557 1726882114.17482: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882114.17488: variable 'omit' from source: magic vars 7557 1726882114.17518: variable 'omit' from source: magic vars 7557 1726882114.17589: variable 'profile' from source: include params 7557 1726882114.17597: variable 'interface' from source: play vars 7557 1726882114.17641: variable 'interface' from source: play vars 7557 1726882114.17655: variable 'omit' from source: magic vars 7557 1726882114.17689: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7557 1726882114.17718: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7557 1726882114.17733: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7557 1726882114.17746: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7557 1726882114.17756: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7557 1726882114.17782: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7557 1726882114.17785: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882114.17788: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882114.17861: Set connection var ansible_module_compression to ZIP_DEFLATED 7557 1726882114.17868: Set connection var ansible_shell_executable to /bin/sh 7557 1726882114.17871: Set connection var ansible_shell_type to sh 7557 1726882114.17875: Set connection var ansible_pipelining to False 7557 1726882114.17877: Set connection var ansible_connection to ssh 7557 1726882114.17882: Set connection var ansible_timeout to 10 7557 1726882114.17903: variable 'ansible_shell_executable' from source: unknown 7557 1726882114.17907: variable 'ansible_connection' from source: unknown 7557 1726882114.17909: variable 'ansible_module_compression' from source: unknown 7557 1726882114.17913: variable 'ansible_shell_type' from source: unknown 7557 1726882114.17915: variable 'ansible_shell_executable' from source: unknown 7557 1726882114.17918: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882114.17920: variable 'ansible_pipelining' from source: unknown 7557 1726882114.17923: variable 'ansible_timeout' from source: unknown 7557 1726882114.17926: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882114.18023: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 7557 1726882114.18032: variable 'omit' from source: magic vars 7557 1726882114.18042: starting attempt loop 7557 1726882114.18045: running the handler 7557 1726882114.18115: variable 'lsr_net_profile_ansible_managed' from source: set_fact 7557 1726882114.18118: Evaluated conditional (lsr_net_profile_ansible_managed): True 7557 1726882114.18124: handler run complete 7557 1726882114.18135: attempt loop complete, returning result 7557 1726882114.18138: _execute() done 7557 1726882114.18141: dumping result to json 7557 1726882114.18145: done dumping result, returning 7557 1726882114.18151: done running TaskExecutor() for managed_node3/TASK: Assert that the ansible managed comment is present in 'veth0' [12673a56-9f93-ed48-b3a5-0000000016d4] 7557 1726882114.18160: sending task result for task 12673a56-9f93-ed48-b3a5-0000000016d4 7557 1726882114.18238: done sending task result for task 12673a56-9f93-ed48-b3a5-0000000016d4 7557 1726882114.18241: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false } MSG: All assertions passed 7557 1726882114.18309: no more pending results, returning what we have 7557 1726882114.18313: results queue empty 7557 1726882114.18314: checking for any_errors_fatal 7557 1726882114.18319: done checking for any_errors_fatal 7557 1726882114.18320: checking for max_fail_percentage 7557 1726882114.18321: done checking for max_fail_percentage 7557 1726882114.18322: checking to see if all hosts have failed and the running result is not ok 7557 1726882114.18323: done checking to see if all hosts have failed 7557 1726882114.18323: getting the remaining hosts for this loop 7557 1726882114.18325: done getting the remaining hosts for this loop 7557 1726882114.18328: getting the next task for host managed_node3 7557 1726882114.18333: done getting next task for host managed_node3 7557 1726882114.18335: ^ task is: TASK: Assert that the fingerprint comment is present in {{ profile }} 7557 1726882114.18338: ^ state is: HOST STATE: block=2, task=31, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7557 1726882114.18341: getting variables 7557 1726882114.18343: in VariableManager get_vars() 7557 1726882114.18391: Calling all_inventory to load vars for managed_node3 7557 1726882114.18397: Calling groups_inventory to load vars for managed_node3 7557 1726882114.18400: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882114.18409: Calling all_plugins_play to load vars for managed_node3 7557 1726882114.18412: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882114.18414: Calling groups_plugins_play to load vars for managed_node3 7557 1726882114.19173: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882114.20047: done with get_vars() 7557 1726882114.20062: done getting variables 7557 1726882114.20104: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 7557 1726882114.20177: variable 'profile' from source: include params 7557 1726882114.20179: variable 'interface' from source: play vars 7557 1726882114.20222: variable 'interface' from source: play vars TASK [Assert that the fingerprint comment is present in veth0] ***************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:15 Friday 20 September 2024 21:28:34 -0400 (0:00:00.037) 0:00:40.055 ****** 7557 1726882114.20248: entering _queue_task() for managed_node3/assert 7557 1726882114.20465: worker is 1 (out of 1 available) 7557 1726882114.20479: exiting _queue_task() for managed_node3/assert 7557 1726882114.20491: done queuing things up, now waiting for results queue to drain 7557 1726882114.20496: waiting for pending results... 
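NOTE: the assertion that just passed and the one just queued both come from tasks/assert_profile_present.yml in the fedora.linux_system_roles network tests. That task file is not reproduced in this log, so the sketch below is a hypothetical reconstruction based only on the task names and on the conditionals the log shows being evaluated (lsr_net_profile_ansible_managed and lsr_net_profile_fingerprint, both populated earlier via set_fact); the fail_msg texts are illustrative, not taken from the source.

- name: "Assert that the ansible managed comment is present in '{{ profile }}'"
  ansible.builtin.assert:
    that:
      - lsr_net_profile_ansible_managed        # set earlier via set_fact
    fail_msg: "ansible managed comment missing from {{ profile }}"   # illustrative message

- name: "Assert that the fingerprint comment is present in {{ profile }}"
  ansible.builtin.assert:
    that:
      - lsr_net_profile_fingerprint            # set earlier via set_fact
    fail_msg: "fingerprint comment missing from {{ profile }}"       # illustrative message

When every expression under 'that' evaluates true, the assert action plugin returns changed=false and its default "All assertions passed" message, which is exactly what the ok: [managed_node3] result above shows.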
7557 1726882114.20659: running TaskExecutor() for managed_node3/TASK: Assert that the fingerprint comment is present in veth0 7557 1726882114.20727: in run() - task 12673a56-9f93-ed48-b3a5-0000000016d5 7557 1726882114.20739: variable 'ansible_search_path' from source: unknown 7557 1726882114.20744: variable 'ansible_search_path' from source: unknown 7557 1726882114.20770: calling self._execute() 7557 1726882114.20857: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882114.20863: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882114.20871: variable 'omit' from source: magic vars 7557 1726882114.21130: variable 'ansible_distribution_major_version' from source: facts 7557 1726882114.21140: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882114.21145: variable 'omit' from source: magic vars 7557 1726882114.21177: variable 'omit' from source: magic vars 7557 1726882114.21246: variable 'profile' from source: include params 7557 1726882114.21250: variable 'interface' from source: play vars 7557 1726882114.21298: variable 'interface' from source: play vars 7557 1726882114.21312: variable 'omit' from source: magic vars 7557 1726882114.21343: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7557 1726882114.21370: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7557 1726882114.21388: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7557 1726882114.21404: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7557 1726882114.21414: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7557 1726882114.21438: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7557 1726882114.21441: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882114.21443: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882114.21516: Set connection var ansible_module_compression to ZIP_DEFLATED 7557 1726882114.21522: Set connection var ansible_shell_executable to /bin/sh 7557 1726882114.21525: Set connection var ansible_shell_type to sh 7557 1726882114.21530: Set connection var ansible_pipelining to False 7557 1726882114.21532: Set connection var ansible_connection to ssh 7557 1726882114.21537: Set connection var ansible_timeout to 10 7557 1726882114.21554: variable 'ansible_shell_executable' from source: unknown 7557 1726882114.21557: variable 'ansible_connection' from source: unknown 7557 1726882114.21559: variable 'ansible_module_compression' from source: unknown 7557 1726882114.21562: variable 'ansible_shell_type' from source: unknown 7557 1726882114.21564: variable 'ansible_shell_executable' from source: unknown 7557 1726882114.21566: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882114.21569: variable 'ansible_pipelining' from source: unknown 7557 1726882114.21571: variable 'ansible_timeout' from source: unknown 7557 1726882114.21575: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882114.21675: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 7557 1726882114.21685: variable 'omit' from source: magic vars 7557 1726882114.21689: starting attempt loop 7557 1726882114.21696: running the handler 7557 1726882114.21768: variable 'lsr_net_profile_fingerprint' from source: set_fact 7557 1726882114.21771: Evaluated conditional (lsr_net_profile_fingerprint): True 7557 1726882114.21777: handler run complete 7557 1726882114.21788: attempt loop complete, returning result 7557 1726882114.21791: _execute() done 7557 1726882114.21797: dumping result to json 7557 1726882114.21800: done dumping result, returning 7557 1726882114.21803: done running TaskExecutor() for managed_node3/TASK: Assert that the fingerprint comment is present in veth0 [12673a56-9f93-ed48-b3a5-0000000016d5] 7557 1726882114.21809: sending task result for task 12673a56-9f93-ed48-b3a5-0000000016d5 7557 1726882114.21890: done sending task result for task 12673a56-9f93-ed48-b3a5-0000000016d5 7557 1726882114.21896: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false } MSG: All assertions passed 7557 1726882114.21955: no more pending results, returning what we have 7557 1726882114.21959: results queue empty 7557 1726882114.21960: checking for any_errors_fatal 7557 1726882114.21966: done checking for any_errors_fatal 7557 1726882114.21966: checking for max_fail_percentage 7557 1726882114.21968: done checking for max_fail_percentage 7557 1726882114.21970: checking to see if all hosts have failed and the running result is not ok 7557 1726882114.21971: done checking to see if all hosts have failed 7557 1726882114.21971: getting the remaining hosts for this loop 7557 1726882114.21973: done getting the remaining hosts for this loop 7557 1726882114.21975: getting the next task for host managed_node3 7557 1726882114.21982: done getting next task for host managed_node3 7557 1726882114.21984: ^ task is: TASK: Show ipv4 routes 7557 1726882114.21986: ^ state is: HOST STATE: block=2, task=32, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7557 1726882114.21990: getting variables 7557 1726882114.21991: in VariableManager get_vars() 7557 1726882114.22041: Calling all_inventory to load vars for managed_node3 7557 1726882114.22044: Calling groups_inventory to load vars for managed_node3 7557 1726882114.22046: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882114.22055: Calling all_plugins_play to load vars for managed_node3 7557 1726882114.22058: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882114.22060: Calling groups_plugins_play to load vars for managed_node3 7557 1726882114.22989: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882114.23852: done with get_vars() 7557 1726882114.23869: done getting variables 7557 1726882114.23916: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Show ipv4 routes] ******************************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_auto_gateway.yml:114 Friday 20 September 2024 21:28:34 -0400 (0:00:00.036) 0:00:40.092 ****** 7557 1726882114.23940: entering _queue_task() for managed_node3/command 7557 1726882114.24191: worker is 1 (out of 1 available) 7557 1726882114.24207: exiting _queue_task() for managed_node3/command 7557 1726882114.24220: done queuing things up, now waiting for results queue to drain 7557 1726882114.24221: waiting for pending results... 7557 1726882114.24403: running TaskExecutor() for managed_node3/TASK: Show ipv4 routes 7557 1726882114.24460: in run() - task 12673a56-9f93-ed48-b3a5-0000000000ff 7557 1726882114.24471: variable 'ansible_search_path' from source: unknown 7557 1726882114.24503: calling self._execute() 7557 1726882114.24586: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882114.24590: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882114.24601: variable 'omit' from source: magic vars 7557 1726882114.24879: variable 'ansible_distribution_major_version' from source: facts 7557 1726882114.24890: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882114.24898: variable 'omit' from source: magic vars 7557 1726882114.24912: variable 'omit' from source: magic vars 7557 1726882114.24937: variable 'omit' from source: magic vars 7557 1726882114.24969: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7557 1726882114.25005: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7557 1726882114.25018: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7557 1726882114.25031: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7557 1726882114.25042: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7557 1726882114.25066: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7557 1726882114.25069: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882114.25071: 
variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882114.25147: Set connection var ansible_module_compression to ZIP_DEFLATED 7557 1726882114.25153: Set connection var ansible_shell_executable to /bin/sh 7557 1726882114.25156: Set connection var ansible_shell_type to sh 7557 1726882114.25161: Set connection var ansible_pipelining to False 7557 1726882114.25164: Set connection var ansible_connection to ssh 7557 1726882114.25168: Set connection var ansible_timeout to 10 7557 1726882114.25187: variable 'ansible_shell_executable' from source: unknown 7557 1726882114.25190: variable 'ansible_connection' from source: unknown 7557 1726882114.25197: variable 'ansible_module_compression' from source: unknown 7557 1726882114.25200: variable 'ansible_shell_type' from source: unknown 7557 1726882114.25203: variable 'ansible_shell_executable' from source: unknown 7557 1726882114.25206: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882114.25208: variable 'ansible_pipelining' from source: unknown 7557 1726882114.25212: variable 'ansible_timeout' from source: unknown 7557 1726882114.25215: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882114.25312: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 7557 1726882114.25323: variable 'omit' from source: magic vars 7557 1726882114.25328: starting attempt loop 7557 1726882114.25330: running the handler 7557 1726882114.25346: _low_level_execute_command(): starting 7557 1726882114.25353: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7557 1726882114.25871: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882114.25874: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882114.25879: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882114.25882: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882114.25936: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882114.25940: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882114.25942: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882114.26009: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882114.27672: stdout chunk (state=3): >>>/root <<< 7557 1726882114.27767: stderr chunk (state=3): >>>debug2: 
Received exit status from master 0 <<< 7557 1726882114.27801: stderr chunk (state=3): >>><<< 7557 1726882114.27804: stdout chunk (state=3): >>><<< 7557 1726882114.27832: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7557 1726882114.27845: _low_level_execute_command(): starting 7557 1726882114.27851: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882114.2783203-9132-213684523523373 `" && echo ansible-tmp-1726882114.2783203-9132-213684523523373="` echo /root/.ansible/tmp/ansible-tmp-1726882114.2783203-9132-213684523523373 `" ) && sleep 0' 7557 1726882114.28282: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7557 1726882114.28289: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7557 1726882114.28319: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882114.28332: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 <<< 7557 1726882114.28334: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882114.28382: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882114.28385: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882114.28390: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882114.28437: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882114.30291: stdout chunk (state=3): 
>>>ansible-tmp-1726882114.2783203-9132-213684523523373=/root/.ansible/tmp/ansible-tmp-1726882114.2783203-9132-213684523523373 <<< 7557 1726882114.30391: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882114.30428: stderr chunk (state=3): >>><<< 7557 1726882114.30431: stdout chunk (state=3): >>><<< 7557 1726882114.30446: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882114.2783203-9132-213684523523373=/root/.ansible/tmp/ansible-tmp-1726882114.2783203-9132-213684523523373 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7557 1726882114.30472: variable 'ansible_module_compression' from source: unknown 7557 1726882114.30515: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-7557ap94rh2e/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 7557 1726882114.30548: variable 'ansible_facts' from source: unknown 7557 1726882114.30604: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882114.2783203-9132-213684523523373/AnsiballZ_command.py 7557 1726882114.30710: Sending initial data 7557 1726882114.30713: Sent initial data (154 bytes) 7557 1726882114.31166: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882114.31169: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found <<< 7557 1726882114.31172: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882114.31174: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882114.31176: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882114.31218: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/537759ca41' <<< 7557 1726882114.31234: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882114.31283: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882114.32806: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 7557 1726882114.32846: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 7557 1726882114.32895: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-7557ap94rh2e/tmp1lkmnx7l /root/.ansible/tmp/ansible-tmp-1726882114.2783203-9132-213684523523373/AnsiballZ_command.py <<< 7557 1726882114.32903: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882114.2783203-9132-213684523523373/AnsiballZ_command.py" <<< 7557 1726882114.32940: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-7557ap94rh2e/tmp1lkmnx7l" to remote "/root/.ansible/tmp/ansible-tmp-1726882114.2783203-9132-213684523523373/AnsiballZ_command.py" <<< 7557 1726882114.32943: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882114.2783203-9132-213684523523373/AnsiballZ_command.py" <<< 7557 1726882114.33476: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882114.33517: stderr chunk (state=3): >>><<< 7557 1726882114.33521: stdout chunk (state=3): >>><<< 7557 1726882114.33560: done transferring module to remote 7557 1726882114.33568: _low_level_execute_command(): starting 7557 1726882114.33573: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882114.2783203-9132-213684523523373/ /root/.ansible/tmp/ansible-tmp-1726882114.2783203-9132-213684523523373/AnsiballZ_command.py && sleep 0' 7557 1726882114.33987: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882114.34025: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7557 1726882114.34028: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found <<< 7557 1726882114.34031: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found <<< 7557 1726882114.34037: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882114.34078: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882114.34082: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882114.34136: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882114.35876: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882114.35903: stderr chunk (state=3): >>><<< 7557 1726882114.35907: stdout chunk (state=3): >>><<< 7557 1726882114.35921: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7557 1726882114.35924: _low_level_execute_command(): starting 7557 1726882114.35927: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882114.2783203-9132-213684523523373/AnsiballZ_command.py && sleep 0' 7557 1726882114.36412: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882114.36415: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882114.36418: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 7557 1726882114.36421: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882114.36468: stderr chunk (state=3): 
>>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882114.36471: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882114.36476: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882114.36528: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882114.51832: stdout chunk (state=3): >>> {"changed": true, "stdout": "default via 10.31.8.1 dev eth0 proto dhcp src 10.31.10.229 metric 100 \n10.31.8.0/22 dev eth0 proto kernel scope link src 10.31.10.229 metric 100 \n203.0.113.0/24 dev veth0 proto kernel scope link src 203.0.113.2 metric 101 ", "stderr": "", "rc": 0, "cmd": ["ip", "route"], "start": "2024-09-20 21:28:34.511211", "end": "2024-09-20 21:28:34.514866", "delta": "0:00:00.003655", "msg": "", "invocation": {"module_args": {"_raw_params": "ip route", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 7557 1726882114.53063: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882114.53076: stderr chunk (state=3): >>>Shared connection to 10.31.10.229 closed. <<< 7557 1726882114.53133: stderr chunk (state=3): >>><<< 7557 1726882114.53153: stdout chunk (state=3): >>><<< 7557 1726882114.53197: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "default via 10.31.8.1 dev eth0 proto dhcp src 10.31.10.229 metric 100 \n10.31.8.0/22 dev eth0 proto kernel scope link src 10.31.10.229 metric 100 \n203.0.113.0/24 dev veth0 proto kernel scope link src 203.0.113.2 metric 101 ", "stderr": "", "rc": 0, "cmd": ["ip", "route"], "start": "2024-09-20 21:28:34.511211", "end": "2024-09-20 21:28:34.514866", "delta": "0:00:00.003655", "msg": "", "invocation": {"module_args": {"_raw_params": "ip route", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. 
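NOTE: the exchange above is the standard remote-execution flow for a command task over SSH: probe the remote home directory (echo ~), create a per-task temp directory, upload the AnsiballZ_command.py wrapper over sftp, chmod it, run it with the remote Python, and read back its JSON result. The module arguments captured in that result ("_raw_params": "ip route") pin down what the task at tests_auto_gateway.yml:114 executes; the register name and changed_when in the sketch below are assumptions, inferred from a later task reading ipv4_routes.stdout and from the final result being reported as changed: false even though the raw module output says changed: true.

- name: Show ipv4 routes
  ansible.builtin.command: ip route
  register: ipv4_routes      # assumed name; a later assert reads ipv4_routes.stdout
  changed_when: false        # assumed; would explain the (False) conditional evaluated after the handler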
7557 1726882114.53257: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip route', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882114.2783203-9132-213684523523373/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7557 1726882114.53287: _low_level_execute_command(): starting 7557 1726882114.53300: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882114.2783203-9132-213684523523373/ > /dev/null 2>&1 && sleep 0' 7557 1726882114.54014: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7557 1726882114.54043: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7557 1726882114.54069: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882114.54128: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found <<< 7557 1726882114.54164: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882114.54176: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882114.54216: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882114.54285: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882114.54302: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882114.54417: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882114.54512: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882114.56297: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882114.56322: stderr chunk (state=3): >>><<< 7557 1726882114.56326: stdout chunk (state=3): >>><<< 7557 1726882114.56347: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is 
address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7557 1726882114.56352: handler run complete 7557 1726882114.56370: Evaluated conditional (False): False 7557 1726882114.56379: attempt loop complete, returning result 7557 1726882114.56382: _execute() done 7557 1726882114.56384: dumping result to json 7557 1726882114.56389: done dumping result, returning 7557 1726882114.56401: done running TaskExecutor() for managed_node3/TASK: Show ipv4 routes [12673a56-9f93-ed48-b3a5-0000000000ff] 7557 1726882114.56406: sending task result for task 12673a56-9f93-ed48-b3a5-0000000000ff 7557 1726882114.56507: done sending task result for task 12673a56-9f93-ed48-b3a5-0000000000ff 7557 1726882114.56510: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "cmd": [ "ip", "route" ], "delta": "0:00:00.003655", "end": "2024-09-20 21:28:34.514866", "rc": 0, "start": "2024-09-20 21:28:34.511211" } STDOUT: default via 10.31.8.1 dev eth0 proto dhcp src 10.31.10.229 metric 100 10.31.8.0/22 dev eth0 proto kernel scope link src 10.31.10.229 metric 100 203.0.113.0/24 dev veth0 proto kernel scope link src 203.0.113.2 metric 101 7557 1726882114.56575: no more pending results, returning what we have 7557 1726882114.56579: results queue empty 7557 1726882114.56580: checking for any_errors_fatal 7557 1726882114.56586: done checking for any_errors_fatal 7557 1726882114.56586: checking for max_fail_percentage 7557 1726882114.56588: done checking for max_fail_percentage 7557 1726882114.56589: checking to see if all hosts have failed and the running result is not ok 7557 1726882114.56590: done checking to see if all hosts have failed 7557 1726882114.56591: getting the remaining hosts for this loop 7557 1726882114.56592: done getting the remaining hosts for this loop 7557 1726882114.56597: getting the next task for host managed_node3 7557 1726882114.56603: done getting next task for host managed_node3 7557 1726882114.56606: ^ task is: TASK: Assert default ipv4 route is absent 7557 1726882114.56607: ^ state is: HOST STATE: block=2, task=33, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7557 1726882114.56611: getting variables 7557 1726882114.56613: in VariableManager get_vars() 7557 1726882114.56661: Calling all_inventory to load vars for managed_node3 7557 1726882114.56664: Calling groups_inventory to load vars for managed_node3 7557 1726882114.56666: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882114.56676: Calling all_plugins_play to load vars for managed_node3 7557 1726882114.56678: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882114.56680: Calling groups_plugins_play to load vars for managed_node3 7557 1726882114.58013: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882114.59524: done with get_vars() 7557 1726882114.59545: done getting variables 7557 1726882114.59621: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Assert default ipv4 route is absent] ************************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_auto_gateway.yml:118 Friday 20 September 2024 21:28:34 -0400 (0:00:00.357) 0:00:40.449 ****** 7557 1726882114.59653: entering _queue_task() for managed_node3/assert 7557 1726882114.59969: worker is 1 (out of 1 available) 7557 1726882114.59983: exiting _queue_task() for managed_node3/assert 7557 1726882114.59997: done queuing things up, now waiting for results queue to drain 7557 1726882114.59999: waiting for pending results... 7557 1726882114.60251: running TaskExecutor() for managed_node3/TASK: Assert default ipv4 route is absent 7557 1726882114.60312: in run() - task 12673a56-9f93-ed48-b3a5-000000000100 7557 1726882114.60316: variable 'ansible_search_path' from source: unknown 7557 1726882114.60368: calling self._execute() 7557 1726882114.60630: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882114.60634: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882114.60638: variable 'omit' from source: magic vars 7557 1726882114.61191: variable 'ansible_distribution_major_version' from source: facts 7557 1726882114.61223: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882114.61235: variable 'omit' from source: magic vars 7557 1726882114.61259: variable 'omit' from source: magic vars 7557 1726882114.61315: variable 'omit' from source: magic vars 7557 1726882114.61358: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7557 1726882114.61412: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7557 1726882114.61435: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7557 1726882114.61456: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7557 1726882114.61474: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7557 1726882114.61519: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7557 1726882114.61527: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 
1726882114.61533: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882114.61653: Set connection var ansible_module_compression to ZIP_DEFLATED 7557 1726882114.61667: Set connection var ansible_shell_executable to /bin/sh 7557 1726882114.61675: Set connection var ansible_shell_type to sh 7557 1726882114.61685: Set connection var ansible_pipelining to False 7557 1726882114.61697: Set connection var ansible_connection to ssh 7557 1726882114.61722: Set connection var ansible_timeout to 10 7557 1726882114.61828: variable 'ansible_shell_executable' from source: unknown 7557 1726882114.61831: variable 'ansible_connection' from source: unknown 7557 1726882114.61834: variable 'ansible_module_compression' from source: unknown 7557 1726882114.61838: variable 'ansible_shell_type' from source: unknown 7557 1726882114.61840: variable 'ansible_shell_executable' from source: unknown 7557 1726882114.61842: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882114.61847: variable 'ansible_pipelining' from source: unknown 7557 1726882114.61849: variable 'ansible_timeout' from source: unknown 7557 1726882114.61852: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882114.62006: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 7557 1726882114.62010: variable 'omit' from source: magic vars 7557 1726882114.62013: starting attempt loop 7557 1726882114.62015: running the handler 7557 1726882114.62084: variable '__test_str' from source: task vars 7557 1726882114.62138: variable 'interface' from source: play vars 7557 1726882114.62146: variable 'ipv4_routes' from source: set_fact 7557 1726882114.62159: Evaluated conditional (__test_str not in ipv4_routes.stdout): True 7557 1726882114.62165: handler run complete 7557 1726882114.62177: attempt loop complete, returning result 7557 1726882114.62180: _execute() done 7557 1726882114.62183: dumping result to json 7557 1726882114.62185: done dumping result, returning 7557 1726882114.62190: done running TaskExecutor() for managed_node3/TASK: Assert default ipv4 route is absent [12673a56-9f93-ed48-b3a5-000000000100] 7557 1726882114.62197: sending task result for task 12673a56-9f93-ed48-b3a5-000000000100 7557 1726882114.62280: done sending task result for task 12673a56-9f93-ed48-b3a5-000000000100 7557 1726882114.62282: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false } MSG: All assertions passed 7557 1726882114.62330: no more pending results, returning what we have 7557 1726882114.62334: results queue empty 7557 1726882114.62335: checking for any_errors_fatal 7557 1726882114.62344: done checking for any_errors_fatal 7557 1726882114.62345: checking for max_fail_percentage 7557 1726882114.62347: done checking for max_fail_percentage 7557 1726882114.62347: checking to see if all hosts have failed and the running result is not ok 7557 1726882114.62348: done checking to see if all hosts have failed 7557 1726882114.62349: getting the remaining hosts for this loop 7557 1726882114.62350: done getting the remaining hosts for this loop 7557 1726882114.62353: getting the next task for host managed_node3 7557 1726882114.62358: done getting next task for host managed_node3 7557 1726882114.62360: ^ task is: TASK: Get ipv6 routes 
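NOTE: the assertion just evaluated was (__test_str not in ipv4_routes.stdout), with __test_str supplied as a task-level var, interface resolved from play vars while templating it, and ipv4_routes coming from the earlier command result. A hypothetical sketch of that task follows; the actual value of __test_str does not appear anywhere in this log, so the one shown is purely illustrative (chosen only to match the 203.0.113.0/24 test network visible in the ip route output).

- name: Assert default ipv4 route is absent
  vars:
    # Hypothetical value for illustration only; the log merely shows that
    # __test_str is a task var and that the 'interface' play var (veth0 here)
    # is resolved while it is templated.
    __test_str: "default via 203.0.113.1 dev {{ interface }}"
  ansible.builtin.assert:
    that:
      - __test_str not in ipv4_routes.stdout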
7557 1726882114.62362: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7557 1726882114.62366: getting variables 7557 1726882114.62367: in VariableManager get_vars() 7557 1726882114.62420: Calling all_inventory to load vars for managed_node3 7557 1726882114.62424: Calling groups_inventory to load vars for managed_node3 7557 1726882114.62426: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882114.62435: Calling all_plugins_play to load vars for managed_node3 7557 1726882114.62438: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882114.62440: Calling groups_plugins_play to load vars for managed_node3 7557 1726882114.63350: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882114.64726: done with get_vars() 7557 1726882114.64745: done getting variables 7557 1726882114.64804: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Get ipv6 routes] ********************************************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_auto_gateway.yml:123 Friday 20 September 2024 21:28:34 -0400 (0:00:00.051) 0:00:40.501 ****** 7557 1726882114.64828: entering _queue_task() for managed_node3/command 7557 1726882114.65085: worker is 1 (out of 1 available) 7557 1726882114.65100: exiting _queue_task() for managed_node3/command 7557 1726882114.65116: done queuing things up, now waiting for results queue to drain 7557 1726882114.65117: waiting for pending results... 
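NOTE: every TaskExecutor run in this log prints the same block of "Set connection var ..." lines before loading the action plugin. For managed_node3 the effective values are ansible_connection=ssh, ansible_timeout=10, ansible_pipelining=False, ansible_shell_type=sh, ansible_shell_executable=/bin/sh and ansible_module_compression=ZIP_DEFLATED; all of these are logged as "from source: unknown", meaning they fall back to plugin defaults rather than inventory overrides, while ansible_host and ansible_ssh_extra_args do come from host vars. A hypothetical inventory excerpt consistent with that is sketched below; the actual inventory file is not reproduced in this log, and the ansible_ssh_extra_args value is invented.

all:
  hosts:
    managed_node3:
      ansible_host: 10.31.10.229                            # matches the address the SSH debug lines connect to for this host
      ansible_ssh_extra_args: "-o StrictHostKeyChecking=no"  # placeholder value, not taken from the log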
7557 1726882114.65414: running TaskExecutor() for managed_node3/TASK: Get ipv6 routes 7557 1726882114.65447: in run() - task 12673a56-9f93-ed48-b3a5-000000000101 7557 1726882114.65469: variable 'ansible_search_path' from source: unknown 7557 1726882114.65518: calling self._execute() 7557 1726882114.65700: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882114.65704: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882114.65707: variable 'omit' from source: magic vars 7557 1726882114.66029: variable 'ansible_distribution_major_version' from source: facts 7557 1726882114.66050: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882114.66064: variable 'omit' from source: magic vars 7557 1726882114.66088: variable 'omit' from source: magic vars 7557 1726882114.66143: variable 'omit' from source: magic vars 7557 1726882114.66201: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7557 1726882114.66279: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7557 1726882114.66282: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7557 1726882114.66296: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7557 1726882114.66310: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7557 1726882114.66340: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7557 1726882114.66360: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882114.66363: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882114.66467: Set connection var ansible_module_compression to ZIP_DEFLATED 7557 1726882114.66471: Set connection var ansible_shell_executable to /bin/sh 7557 1726882114.66474: Set connection var ansible_shell_type to sh 7557 1726882114.66476: Set connection var ansible_pipelining to False 7557 1726882114.66481: Set connection var ansible_connection to ssh 7557 1726882114.66484: Set connection var ansible_timeout to 10 7557 1726882114.66492: variable 'ansible_shell_executable' from source: unknown 7557 1726882114.66502: variable 'ansible_connection' from source: unknown 7557 1726882114.66573: variable 'ansible_module_compression' from source: unknown 7557 1726882114.66575: variable 'ansible_shell_type' from source: unknown 7557 1726882114.66577: variable 'ansible_shell_executable' from source: unknown 7557 1726882114.66578: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882114.66580: variable 'ansible_pipelining' from source: unknown 7557 1726882114.66581: variable 'ansible_timeout' from source: unknown 7557 1726882114.66582: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882114.66638: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 7557 1726882114.66650: variable 'omit' from source: magic vars 7557 1726882114.66656: starting attempt loop 7557 1726882114.66661: running the handler 7557 1726882114.66674: 
_low_level_execute_command(): starting 7557 1726882114.66687: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7557 1726882114.67174: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882114.67192: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882114.67255: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882114.67261: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882114.67263: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882114.67312: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882114.68945: stdout chunk (state=3): >>>/root <<< 7557 1726882114.69041: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882114.69071: stderr chunk (state=3): >>><<< 7557 1726882114.69073: stdout chunk (state=3): >>><<< 7557 1726882114.69087: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7557 1726882114.69158: _low_level_execute_command(): starting 7557 1726882114.69162: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882114.6909595-9154-217240102069653 `" && echo ansible-tmp-1726882114.6909595-9154-217240102069653="` echo /root/.ansible/tmp/ansible-tmp-1726882114.6909595-9154-217240102069653 `" ) && sleep 0' 7557 
1726882114.69519: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882114.69522: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882114.69526: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 7557 1726882114.69537: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882114.69576: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882114.69580: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882114.69635: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882114.71471: stdout chunk (state=3): >>>ansible-tmp-1726882114.6909595-9154-217240102069653=/root/.ansible/tmp/ansible-tmp-1726882114.6909595-9154-217240102069653 <<< 7557 1726882114.71582: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882114.71605: stderr chunk (state=3): >>><<< 7557 1726882114.71608: stdout chunk (state=3): >>><<< 7557 1726882114.71622: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882114.6909595-9154-217240102069653=/root/.ansible/tmp/ansible-tmp-1726882114.6909595-9154-217240102069653 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7557 1726882114.71645: variable 'ansible_module_compression' from source: unknown 7557 1726882114.71683: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-7557ap94rh2e/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 7557 1726882114.71719: variable 'ansible_facts' from source: 
unknown 7557 1726882114.71772: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882114.6909595-9154-217240102069653/AnsiballZ_command.py 7557 1726882114.71877: Sending initial data 7557 1726882114.71880: Sent initial data (154 bytes) 7557 1726882114.72319: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882114.72322: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found <<< 7557 1726882114.72324: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882114.72327: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882114.72329: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882114.72376: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882114.72379: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882114.72441: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882114.73962: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 7557 1726882114.74018: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 7557 1726882114.74061: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-7557ap94rh2e/tmpvzdwhl9v /root/.ansible/tmp/ansible-tmp-1726882114.6909595-9154-217240102069653/AnsiballZ_command.py <<< 7557 1726882114.74102: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882114.6909595-9154-217240102069653/AnsiballZ_command.py" <<< 7557 1726882114.74129: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-7557ap94rh2e/tmpvzdwhl9v" to remote "/root/.ansible/tmp/ansible-tmp-1726882114.6909595-9154-217240102069653/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882114.6909595-9154-217240102069653/AnsiballZ_command.py" <<< 7557 1726882114.74879: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882114.74944: stderr chunk (state=3): >>><<< 7557 1726882114.74947: stdout chunk (state=3): >>><<< 7557 1726882114.74958: done transferring module to remote 7557 1726882114.74968: _low_level_execute_command(): starting 7557 1726882114.74973: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882114.6909595-9154-217240102069653/ /root/.ansible/tmp/ansible-tmp-1726882114.6909595-9154-217240102069653/AnsiballZ_command.py && sleep 0' 7557 1726882114.75390: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882114.75395: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882114.75398: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 7557 1726882114.75400: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882114.75451: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882114.75455: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882114.75506: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882114.77253: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882114.77256: stdout chunk (state=3): >>><<< 7557 1726882114.77259: stderr chunk (state=3): >>><<< 7557 1726882114.77274: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7557 1726882114.77339: _low_level_execute_command(): starting 7557 1726882114.77342: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882114.6909595-9154-217240102069653/AnsiballZ_command.py && sleep 0' 7557 1726882114.77936: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7557 1726882114.77948: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7557 1726882114.78009: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882114.78072: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882114.78102: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882114.78116: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882114.78199: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882114.93483: stdout chunk (state=3): >>> {"changed": true, "stdout": "2001:db8::/64 dev veth0 proto kernel metric 101 pref medium\nfe80::/64 dev peerveth0 proto kernel metric 256 pref medium\nfe80::/64 dev eth0 proto kernel metric 1024 pref medium\nfe80::/64 dev veth0 proto kernel metric 1024 pref medium", "stderr": "", "rc": 0, "cmd": ["ip", "-6", "route"], "start": "2024-09-20 21:28:34.929449", "end": "2024-09-20 21:28:34.933091", "delta": "0:00:00.003642", "msg": "", "invocation": {"module_args": {"_raw_params": "ip -6 route", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 7557 1726882114.95097: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. 
<<< 7557 1726882114.95101: stdout chunk (state=3): >>><<< 7557 1726882114.95104: stderr chunk (state=3): >>><<< 7557 1726882114.95247: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "2001:db8::/64 dev veth0 proto kernel metric 101 pref medium\nfe80::/64 dev peerveth0 proto kernel metric 256 pref medium\nfe80::/64 dev eth0 proto kernel metric 1024 pref medium\nfe80::/64 dev veth0 proto kernel metric 1024 pref medium", "stderr": "", "rc": 0, "cmd": ["ip", "-6", "route"], "start": "2024-09-20 21:28:34.929449", "end": "2024-09-20 21:28:34.933091", "delta": "0:00:00.003642", "msg": "", "invocation": {"module_args": {"_raw_params": "ip -6 route", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. 
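For orientation: the module invocation shown above (ansible.legacy.command running "ip -6 route" with _uses_shell: false) is what the "Get ipv6 routes" task expands to on the target. A minimal reconstruction of such a task is sketched below; the register name ipv6_route is inferred from the later assert, and changed_when: false is assumed because the reported task result says "changed": false even though the raw module output says "changed": true. The exact YAML in tests_auto_gateway.yml may differ.

    - name: Get ipv6 routes
      command: ip -6 route        # module args seen in the log: _raw_params "ip -6 route", no shell
      register: ipv6_route        # assumed name; the later assert reads ipv6_route.stdout
      changed_when: false         # assumed; would explain the "changed": false in the reported result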
7557 1726882114.95251: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip -6 route', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882114.6909595-9154-217240102069653/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7557 1726882114.95255: _low_level_execute_command(): starting 7557 1726882114.95257: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882114.6909595-9154-217240102069653/ > /dev/null 2>&1 && sleep 0' 7557 1726882114.95800: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7557 1726882114.95818: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7557 1726882114.95833: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882114.95850: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7557 1726882114.95871: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 <<< 7557 1726882114.95881: stderr chunk (state=3): >>>debug2: match not found <<< 7557 1726882114.95991: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882114.96002: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882114.96022: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882114.96045: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882114.96114: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882114.97972: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882114.97976: stdout chunk (state=3): >>><<< 7557 1726882114.97990: stderr chunk (state=3): >>><<< 7557 1726882114.98010: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is 
address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7557 1726882114.98016: handler run complete 7557 1726882114.98043: Evaluated conditional (False): False 7557 1726882114.98054: attempt loop complete, returning result 7557 1726882114.98056: _execute() done 7557 1726882114.98059: dumping result to json 7557 1726882114.98064: done dumping result, returning 7557 1726882114.98072: done running TaskExecutor() for managed_node3/TASK: Get ipv6 routes [12673a56-9f93-ed48-b3a5-000000000101] 7557 1726882114.98078: sending task result for task 12673a56-9f93-ed48-b3a5-000000000101 7557 1726882114.98191: done sending task result for task 12673a56-9f93-ed48-b3a5-000000000101 7557 1726882114.98198: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "cmd": [ "ip", "-6", "route" ], "delta": "0:00:00.003642", "end": "2024-09-20 21:28:34.933091", "rc": 0, "start": "2024-09-20 21:28:34.929449" } STDOUT: 2001:db8::/64 dev veth0 proto kernel metric 101 pref medium fe80::/64 dev peerveth0 proto kernel metric 256 pref medium fe80::/64 dev eth0 proto kernel metric 1024 pref medium fe80::/64 dev veth0 proto kernel metric 1024 pref medium 7557 1726882114.98289: no more pending results, returning what we have 7557 1726882114.98296: results queue empty 7557 1726882114.98297: checking for any_errors_fatal 7557 1726882114.98304: done checking for any_errors_fatal 7557 1726882114.98305: checking for max_fail_percentage 7557 1726882114.98307: done checking for max_fail_percentage 7557 1726882114.98310: checking to see if all hosts have failed and the running result is not ok 7557 1726882114.98311: done checking to see if all hosts have failed 7557 1726882114.98312: getting the remaining hosts for this loop 7557 1726882114.98313: done getting the remaining hosts for this loop 7557 1726882114.98317: getting the next task for host managed_node3 7557 1726882114.98322: done getting next task for host managed_node3 7557 1726882114.98325: ^ task is: TASK: Assert default ipv6 route is absent 7557 1726882114.98327: ^ state is: HOST STATE: block=2, task=35, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7557 1726882114.98331: getting variables 7557 1726882114.98333: in VariableManager get_vars() 7557 1726882114.98392: Calling all_inventory to load vars for managed_node3 7557 1726882114.98602: Calling groups_inventory to load vars for managed_node3 7557 1726882114.98609: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882114.98621: Calling all_plugins_play to load vars for managed_node3 7557 1726882114.98624: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882114.98627: Calling groups_plugins_play to load vars for managed_node3 7557 1726882115.00587: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882115.02334: done with get_vars() 7557 1726882115.02361: done getting variables 7557 1726882115.02435: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Assert default ipv6 route is absent] ************************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_auto_gateway.yml:127 Friday 20 September 2024 21:28:35 -0400 (0:00:00.376) 0:00:40.877 ****** 7557 1726882115.02464: entering _queue_task() for managed_node3/assert 7557 1726882115.02869: worker is 1 (out of 1 available) 7557 1726882115.02881: exiting _queue_task() for managed_node3/assert 7557 1726882115.02952: done queuing things up, now waiting for results queue to drain 7557 1726882115.02954: waiting for pending results... 7557 1726882115.03415: running TaskExecutor() for managed_node3/TASK: Assert default ipv6 route is absent 7557 1726882115.03420: in run() - task 12673a56-9f93-ed48-b3a5-000000000102 7557 1726882115.03423: variable 'ansible_search_path' from source: unknown 7557 1726882115.03426: calling self._execute() 7557 1726882115.03591: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882115.03600: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882115.03619: variable 'omit' from source: magic vars 7557 1726882115.04500: variable 'ansible_distribution_major_version' from source: facts 7557 1726882115.04504: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882115.04674: variable 'network_provider' from source: set_fact 7557 1726882115.04678: Evaluated conditional (network_provider == "nm"): True 7557 1726882115.04681: variable 'omit' from source: magic vars 7557 1726882115.04899: variable 'omit' from source: magic vars 7557 1726882115.04903: variable 'omit' from source: magic vars 7557 1726882115.04905: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7557 1726882115.04931: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7557 1726882115.04980: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7557 1726882115.04984: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7557 1726882115.04987: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7557 1726882115.05122: variable 
'inventory_hostname' from source: host vars for 'managed_node3' 7557 1726882115.05126: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882115.05128: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882115.05247: Set connection var ansible_module_compression to ZIP_DEFLATED 7557 1726882115.05250: Set connection var ansible_shell_executable to /bin/sh 7557 1726882115.05253: Set connection var ansible_shell_type to sh 7557 1726882115.05255: Set connection var ansible_pipelining to False 7557 1726882115.05258: Set connection var ansible_connection to ssh 7557 1726882115.05265: Set connection var ansible_timeout to 10 7557 1726882115.05284: variable 'ansible_shell_executable' from source: unknown 7557 1726882115.05288: variable 'ansible_connection' from source: unknown 7557 1726882115.05290: variable 'ansible_module_compression' from source: unknown 7557 1726882115.05296: variable 'ansible_shell_type' from source: unknown 7557 1726882115.05299: variable 'ansible_shell_executable' from source: unknown 7557 1726882115.05301: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882115.05304: variable 'ansible_pipelining' from source: unknown 7557 1726882115.05306: variable 'ansible_timeout' from source: unknown 7557 1726882115.05308: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882115.05663: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 7557 1726882115.05674: variable 'omit' from source: magic vars 7557 1726882115.05679: starting attempt loop 7557 1726882115.05682: running the handler 7557 1726882115.06103: variable '__test_str' from source: task vars 7557 1726882115.06289: variable 'interface' from source: play vars 7557 1726882115.06305: variable 'ipv6_route' from source: set_fact 7557 1726882115.06317: Evaluated conditional (__test_str not in ipv6_route.stdout): True 7557 1726882115.06322: handler run complete 7557 1726882115.06336: attempt loop complete, returning result 7557 1726882115.06339: _execute() done 7557 1726882115.06342: dumping result to json 7557 1726882115.06344: done dumping result, returning 7557 1726882115.06397: done running TaskExecutor() for managed_node3/TASK: Assert default ipv6 route is absent [12673a56-9f93-ed48-b3a5-000000000102] 7557 1726882115.06400: sending task result for task 12673a56-9f93-ed48-b3a5-000000000102 7557 1726882115.06568: done sending task result for task 12673a56-9f93-ed48-b3a5-000000000102 7557 1726882115.06570: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false } MSG: All assertions passed 7557 1726882115.06638: no more pending results, returning what we have 7557 1726882115.06643: results queue empty 7557 1726882115.06644: checking for any_errors_fatal 7557 1726882115.06656: done checking for any_errors_fatal 7557 1726882115.06657: checking for max_fail_percentage 7557 1726882115.06659: done checking for max_fail_percentage 7557 1726882115.06661: checking to see if all hosts have failed and the running result is not ok 7557 1726882115.06662: done checking to see if all hosts have failed 7557 1726882115.06662: getting the remaining hosts for this loop 7557 1726882115.06664: done getting the remaining hosts for this loop 7557 1726882115.06668: getting the 
next task for host managed_node3 7557 1726882115.06674: done getting next task for host managed_node3 7557 1726882115.06678: ^ task is: TASK: TEARDOWN: remove profiles. 7557 1726882115.06680: ^ state is: HOST STATE: block=2, task=36, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7557 1726882115.06683: getting variables 7557 1726882115.06685: in VariableManager get_vars() 7557 1726882115.06747: Calling all_inventory to load vars for managed_node3 7557 1726882115.06750: Calling groups_inventory to load vars for managed_node3 7557 1726882115.06754: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882115.06766: Calling all_plugins_play to load vars for managed_node3 7557 1726882115.06769: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882115.06772: Calling groups_plugins_play to load vars for managed_node3 7557 1726882115.10096: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882115.12320: done with get_vars() 7557 1726882115.12347: done getting variables 7557 1726882115.12415: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [TEARDOWN: remove profiles.] ********************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_auto_gateway.yml:133 Friday 20 September 2024 21:28:35 -0400 (0:00:00.099) 0:00:40.977 ****** 7557 1726882115.12446: entering _queue_task() for managed_node3/debug 7557 1726882115.12919: worker is 1 (out of 1 available) 7557 1726882115.12932: exiting _queue_task() for managed_node3/debug 7557 1726882115.12944: done queuing things up, now waiting for results queue to drain 7557 1726882115.12946: waiting for pending results... 7557 1726882115.13513: running TaskExecutor() for managed_node3/TASK: TEARDOWN: remove profiles. 
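The assertion evaluated above, (__test_str not in ipv6_route.stdout), corresponds to an assert task along the following lines. This is a sketch, not the literal content of tests_auto_gateway.yml:127; the value of __test_str (a task var that, per the variables loaded above, involves interface) is not visible in this log.

    - name: Assert default ipv6 route is absent
      assert:
        that:
          - __test_str not in ipv6_route.stdout   # conditional exactly as evaluated in the log
      # __test_str is defined elsewhere (task/play vars); its value is not shown here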
7557 1726882115.13522: in run() - task 12673a56-9f93-ed48-b3a5-000000000103 7557 1726882115.13526: variable 'ansible_search_path' from source: unknown 7557 1726882115.13529: calling self._execute() 7557 1726882115.13531: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882115.13534: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882115.13536: variable 'omit' from source: magic vars 7557 1726882115.13850: variable 'ansible_distribution_major_version' from source: facts 7557 1726882115.13863: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882115.13869: variable 'omit' from source: magic vars 7557 1726882115.13895: variable 'omit' from source: magic vars 7557 1726882115.13934: variable 'omit' from source: magic vars 7557 1726882115.13976: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7557 1726882115.14021: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7557 1726882115.14039: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7557 1726882115.14057: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7557 1726882115.14073: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7557 1726882115.14298: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7557 1726882115.14301: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882115.14304: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882115.14306: Set connection var ansible_module_compression to ZIP_DEFLATED 7557 1726882115.14309: Set connection var ansible_shell_executable to /bin/sh 7557 1726882115.14312: Set connection var ansible_shell_type to sh 7557 1726882115.14314: Set connection var ansible_pipelining to False 7557 1726882115.14315: Set connection var ansible_connection to ssh 7557 1726882115.14318: Set connection var ansible_timeout to 10 7557 1726882115.14320: variable 'ansible_shell_executable' from source: unknown 7557 1726882115.14322: variable 'ansible_connection' from source: unknown 7557 1726882115.14325: variable 'ansible_module_compression' from source: unknown 7557 1726882115.14327: variable 'ansible_shell_type' from source: unknown 7557 1726882115.14329: variable 'ansible_shell_executable' from source: unknown 7557 1726882115.14331: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882115.14332: variable 'ansible_pipelining' from source: unknown 7557 1726882115.14334: variable 'ansible_timeout' from source: unknown 7557 1726882115.14336: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882115.14462: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 7557 1726882115.14599: variable 'omit' from source: magic vars 7557 1726882115.14602: starting attempt loop 7557 1726882115.14605: running the handler 7557 1726882115.14608: handler run complete 7557 1726882115.14611: attempt loop complete, returning result 7557 1726882115.14613: 
_execute() done 7557 1726882115.14614: dumping result to json 7557 1726882115.14617: done dumping result, returning 7557 1726882115.14619: done running TaskExecutor() for managed_node3/TASK: TEARDOWN: remove profiles. [12673a56-9f93-ed48-b3a5-000000000103] 7557 1726882115.14621: sending task result for task 12673a56-9f93-ed48-b3a5-000000000103 7557 1726882115.14682: done sending task result for task 12673a56-9f93-ed48-b3a5-000000000103 7557 1726882115.14685: WORKER PROCESS EXITING ok: [managed_node3] => {} MSG: ################################################## 7557 1726882115.14738: no more pending results, returning what we have 7557 1726882115.14742: results queue empty 7557 1726882115.14744: checking for any_errors_fatal 7557 1726882115.14757: done checking for any_errors_fatal 7557 1726882115.14759: checking for max_fail_percentage 7557 1726882115.14761: done checking for max_fail_percentage 7557 1726882115.14762: checking to see if all hosts have failed and the running result is not ok 7557 1726882115.14763: done checking to see if all hosts have failed 7557 1726882115.14764: getting the remaining hosts for this loop 7557 1726882115.14765: done getting the remaining hosts for this loop 7557 1726882115.14769: getting the next task for host managed_node3 7557 1726882115.14777: done getting next task for host managed_node3 7557 1726882115.14783: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 7557 1726882115.14787: ^ state is: HOST STATE: block=2, task=37, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7557 1726882115.14815: getting variables 7557 1726882115.14817: in VariableManager get_vars() 7557 1726882115.14982: Calling all_inventory to load vars for managed_node3 7557 1726882115.14986: Calling groups_inventory to load vars for managed_node3 7557 1726882115.14989: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882115.15003: Calling all_plugins_play to load vars for managed_node3 7557 1726882115.15007: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882115.15010: Calling groups_plugins_play to load vars for managed_node3 7557 1726882115.16764: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882115.20602: done with get_vars() 7557 1726882115.20632: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 21:28:35 -0400 (0:00:00.082) 0:00:41.060 ****** 7557 1726882115.20864: entering _queue_task() for managed_node3/include_tasks 7557 1726882115.21585: worker is 1 (out of 1 available) 7557 1726882115.21602: exiting _queue_task() for managed_node3/include_tasks 7557 1726882115.21690: done queuing things up, now waiting for results queue to drain 7557 1726882115.21692: waiting for pending results... 
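The TEARDOWN task reported above is a plain debug action whose only output is the '#' banner shown in MSG. A minimal equivalent (banner length approximate, reconstructed from the output rather than copied from tests_auto_gateway.yml:133):

    - name: "TEARDOWN: remove profiles."
      debug:
        msg: "##################################################"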
7557 1726882115.22210: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 7557 1726882115.22600: in run() - task 12673a56-9f93-ed48-b3a5-00000000010b 7557 1726882115.22604: variable 'ansible_search_path' from source: unknown 7557 1726882115.22608: variable 'ansible_search_path' from source: unknown 7557 1726882115.22611: calling self._execute() 7557 1726882115.22614: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882115.22616: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882115.22618: variable 'omit' from source: magic vars 7557 1726882115.23599: variable 'ansible_distribution_major_version' from source: facts 7557 1726882115.23603: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882115.23606: _execute() done 7557 1726882115.23609: dumping result to json 7557 1726882115.23611: done dumping result, returning 7557 1726882115.23614: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [12673a56-9f93-ed48-b3a5-00000000010b] 7557 1726882115.23616: sending task result for task 12673a56-9f93-ed48-b3a5-00000000010b 7557 1726882115.23737: done sending task result for task 12673a56-9f93-ed48-b3a5-00000000010b 7557 1726882115.23742: WORKER PROCESS EXITING 7557 1726882115.23759: no more pending results, returning what we have 7557 1726882115.23764: in VariableManager get_vars() 7557 1726882115.23825: Calling all_inventory to load vars for managed_node3 7557 1726882115.23828: Calling groups_inventory to load vars for managed_node3 7557 1726882115.23831: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882115.23931: Calling all_plugins_play to load vars for managed_node3 7557 1726882115.23936: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882115.23939: Calling groups_plugins_play to load vars for managed_node3 7557 1726882115.25327: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882115.27012: done with get_vars() 7557 1726882115.27046: variable 'ansible_search_path' from source: unknown 7557 1726882115.27047: variable 'ansible_search_path' from source: unknown 7557 1726882115.27089: we have included files to process 7557 1726882115.27090: generating all_blocks data 7557 1726882115.27096: done generating all_blocks data 7557 1726882115.27103: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 7557 1726882115.27105: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 7557 1726882115.27107: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 7557 1726882115.27777: done processing included file 7557 1726882115.27780: iterating over new_blocks loaded from include file 7557 1726882115.27782: in VariableManager get_vars() 7557 1726882115.27826: done with get_vars() 7557 1726882115.27828: filtering new block on tags 7557 1726882115.27846: done filtering new block on tags 7557 1726882115.27849: in VariableManager get_vars() 7557 1726882115.27877: done with get_vars() 7557 1726882115.27879: filtering new block on tags 7557 1726882115.27911: done filtering new block on tags 7557 1726882115.27915: in VariableManager get_vars() 7557 1726882115.27944: done 
with get_vars() 7557 1726882115.27945: filtering new block on tags 7557 1726882115.27963: done filtering new block on tags 7557 1726882115.27965: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node3 7557 1726882115.27971: extending task lists for all hosts with included blocks 7557 1726882115.28883: done extending task lists 7557 1726882115.28885: done processing included files 7557 1726882115.28886: results queue empty 7557 1726882115.28887: checking for any_errors_fatal 7557 1726882115.28891: done checking for any_errors_fatal 7557 1726882115.28896: checking for max_fail_percentage 7557 1726882115.28897: done checking for max_fail_percentage 7557 1726882115.28898: checking to see if all hosts have failed and the running result is not ok 7557 1726882115.28899: done checking to see if all hosts have failed 7557 1726882115.28900: getting the remaining hosts for this loop 7557 1726882115.28901: done getting the remaining hosts for this loop 7557 1726882115.28904: getting the next task for host managed_node3 7557 1726882115.28908: done getting next task for host managed_node3 7557 1726882115.28911: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 7557 1726882115.28914: ^ state is: HOST STATE: block=2, task=37, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7557 1726882115.28925: getting variables 7557 1726882115.28927: in VariableManager get_vars() 7557 1726882115.28947: Calling all_inventory to load vars for managed_node3 7557 1726882115.28949: Calling groups_inventory to load vars for managed_node3 7557 1726882115.28951: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882115.28956: Calling all_plugins_play to load vars for managed_node3 7557 1726882115.28959: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882115.28962: Calling groups_plugins_play to load vars for managed_node3 7557 1726882115.30263: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882115.32584: done with get_vars() 7557 1726882115.32612: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Friday 20 September 2024 21:28:35 -0400 (0:00:00.119) 0:00:41.179 ****** 7557 1726882115.32701: entering _queue_task() for managed_node3/setup 7557 1726882115.33047: worker is 1 (out of 1 available) 7557 1726882115.33060: exiting _queue_task() for managed_node3/setup 7557 1726882115.33072: done queuing things up, now waiting for results queue to drain 7557 1726882115.33074: waiting for pending results... 7557 1726882115.33397: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 7557 1726882115.33801: in run() - task 12673a56-9f93-ed48-b3a5-0000000019b6 7557 1726882115.33806: variable 'ansible_search_path' from source: unknown 7557 1726882115.33810: variable 'ansible_search_path' from source: unknown 7557 1726882115.33813: calling self._execute() 7557 1726882115.33816: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882115.33818: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882115.33821: variable 'omit' from source: magic vars 7557 1726882115.34141: variable 'ansible_distribution_major_version' from source: facts 7557 1726882115.34154: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882115.34526: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 7557 1726882115.36792: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 7557 1726882115.36877: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 7557 1726882115.36923: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 7557 1726882115.36955: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 7557 1726882115.36980: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 7557 1726882115.37069: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7557 1726882115.37099: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7557 1726882115.37128: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7557 1726882115.37173: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7557 1726882115.37188: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7557 1726882115.37251: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7557 1726882115.37273: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7557 1726882115.37302: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7557 1726882115.37345: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7557 1726882115.37363: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7557 1726882115.37533: variable '__network_required_facts' from source: role '' defaults 7557 1726882115.37543: variable 'ansible_facts' from source: unknown 7557 1726882115.38365: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 7557 1726882115.38368: when evaluation is False, skipping this task 7557 1726882115.38371: _execute() done 7557 1726882115.38373: dumping result to json 7557 1726882115.38376: done dumping result, returning 7557 1726882115.38379: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [12673a56-9f93-ed48-b3a5-0000000019b6] 7557 1726882115.38381: sending task result for task 12673a56-9f93-ed48-b3a5-0000000019b6 7557 1726882115.38481: done sending task result for task 12673a56-9f93-ed48-b3a5-0000000019b6 7557 1726882115.38484: WORKER PROCESS EXITING skipping: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 7557 1726882115.38588: no more pending results, returning what we have 7557 1726882115.38597: results queue empty 7557 1726882115.38598: checking for any_errors_fatal 7557 1726882115.38600: done checking for any_errors_fatal 7557 1726882115.38600: checking for max_fail_percentage 7557 1726882115.38602: done checking for max_fail_percentage 7557 1726882115.38603: checking to see if all hosts have failed and the running result is not ok 7557 1726882115.38604: done checking to see if all hosts have failed 7557 1726882115.38605: getting the remaining hosts for this loop 7557 1726882115.38607: done getting the remaining hosts for this loop 7557 1726882115.38611: getting the next task for host managed_node3 
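The skipped task above ("Ensure ansible_facts used by role are present") is a setup (fact-gathering) action guarded by the conditional that was just evaluated to False. A hedged sketch of that guard, assuming the role simply re-runs setup when any required fact is missing; the real set_facts.yml may pass additional setup options (e.g. a gather_subset) that this log does not show:

    - name: Ensure ansible_facts used by role are present
      setup:
      when: __network_required_facts | difference(ansible_facts.keys() | list) | length > 0
      # skipped here because every fact named in __network_required_facts is already present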
7557 1726882115.38622: done getting next task for host managed_node3 7557 1726882115.38626: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 7557 1726882115.38631: ^ state is: HOST STATE: block=2, task=37, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7557 1726882115.38770: getting variables 7557 1726882115.38772: in VariableManager get_vars() 7557 1726882115.38836: Calling all_inventory to load vars for managed_node3 7557 1726882115.38840: Calling groups_inventory to load vars for managed_node3 7557 1726882115.38842: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882115.38853: Calling all_plugins_play to load vars for managed_node3 7557 1726882115.38857: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882115.38860: Calling groups_plugins_play to load vars for managed_node3 7557 1726882115.40653: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882115.42252: done with get_vars() 7557 1726882115.42285: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Friday 20 September 2024 21:28:35 -0400 (0:00:00.096) 0:00:41.276 ****** 7557 1726882115.42408: entering _queue_task() for managed_node3/stat 7557 1726882115.42777: worker is 1 (out of 1 available) 7557 1726882115.42789: exiting _queue_task() for managed_node3/stat 7557 1726882115.43009: done queuing things up, now waiting for results queue to drain 7557 1726882115.43011: waiting for pending results... 
7557 1726882115.43332: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if system is ostree 7557 1726882115.43338: in run() - task 12673a56-9f93-ed48-b3a5-0000000019b8 7557 1726882115.43342: variable 'ansible_search_path' from source: unknown 7557 1726882115.43346: variable 'ansible_search_path' from source: unknown 7557 1726882115.43349: calling self._execute() 7557 1726882115.43450: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882115.43457: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882115.43473: variable 'omit' from source: magic vars 7557 1726882115.43872: variable 'ansible_distribution_major_version' from source: facts 7557 1726882115.43884: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882115.44065: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 7557 1726882115.44357: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 7557 1726882115.44411: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 7557 1726882115.44517: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 7557 1726882115.44521: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 7557 1726882115.44579: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 7557 1726882115.44609: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 7557 1726882115.44643: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 7557 1726882115.44668: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 7557 1726882115.44763: variable '__network_is_ostree' from source: set_fact 7557 1726882115.44767: Evaluated conditional (not __network_is_ostree is defined): False 7557 1726882115.44770: when evaluation is False, skipping this task 7557 1726882115.44779: _execute() done 7557 1726882115.44783: dumping result to json 7557 1726882115.44785: done dumping result, returning 7557 1726882115.44797: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if system is ostree [12673a56-9f93-ed48-b3a5-0000000019b8] 7557 1726882115.44843: sending task result for task 12673a56-9f93-ed48-b3a5-0000000019b8 skipping: [managed_node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 7557 1726882115.44965: no more pending results, returning what we have 7557 1726882115.44969: results queue empty 7557 1726882115.44970: checking for any_errors_fatal 7557 1726882115.44980: done checking for any_errors_fatal 7557 1726882115.44981: checking for max_fail_percentage 7557 1726882115.44983: done checking for max_fail_percentage 7557 1726882115.44984: checking to see if all hosts have failed and the running result is not ok 
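The "Check if system is ostree" task above is a stat action guarded by 'not __network_is_ostree is defined', and it was skipped because that flag is already set. A sketch under stated assumptions: the path /run/ostree-booted and the register name are guesses, since the log shows only the guard, not the stat arguments:

    - name: Check if system is ostree
      stat:
        path: /run/ostree-booted        # assumed path; not visible in this log
      register: __ostree_booted_stat    # assumed register name
      when: not __network_is_ostree is defined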
7557 1726882115.44985: done checking to see if all hosts have failed 7557 1726882115.44985: getting the remaining hosts for this loop 7557 1726882115.44988: done getting the remaining hosts for this loop 7557 1726882115.44996: getting the next task for host managed_node3 7557 1726882115.45005: done getting next task for host managed_node3 7557 1726882115.45009: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 7557 1726882115.45013: ^ state is: HOST STATE: block=2, task=37, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7557 1726882115.45041: getting variables 7557 1726882115.45043: in VariableManager get_vars() 7557 1726882115.45314: Calling all_inventory to load vars for managed_node3 7557 1726882115.45317: Calling groups_inventory to load vars for managed_node3 7557 1726882115.45320: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882115.45331: Calling all_plugins_play to load vars for managed_node3 7557 1726882115.45334: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882115.45338: Calling groups_plugins_play to load vars for managed_node3 7557 1726882115.45987: done sending task result for task 12673a56-9f93-ed48-b3a5-0000000019b8 7557 1726882115.45991: WORKER PROCESS EXITING 7557 1726882115.46920: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882115.48539: done with get_vars() 7557 1726882115.48568: done getting variables 7557 1726882115.48636: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Friday 20 September 2024 21:28:35 -0400 (0:00:00.062) 0:00:41.339 ****** 7557 1726882115.48675: entering _queue_task() for managed_node3/set_fact 7557 1726882115.49041: worker is 1 (out of 1 available) 7557 1726882115.49053: exiting _queue_task() for managed_node3/set_fact 7557 1726882115.49071: done queuing things up, now waiting for results queue to drain 7557 1726882115.49073: waiting for pending results... 
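The follow-up task queued above, "Set flag to indicate system is ostree", is a set_fact action with the same guard. A sketch, assuming it derives __network_is_ostree from the stat result of the previous task (the expression and the __ostree_booted_stat name are assumptions):

    - name: Set flag to indicate system is ostree
      set_fact:
        __network_is_ostree: "{{ __ostree_booted_stat.stat.exists | default(false) }}"   # assumed expression
      when: not __network_is_ostree is defined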
7557 1726882115.49412: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 7557 1726882115.49569: in run() - task 12673a56-9f93-ed48-b3a5-0000000019b9 7557 1726882115.49590: variable 'ansible_search_path' from source: unknown 7557 1726882115.49697: variable 'ansible_search_path' from source: unknown 7557 1726882115.49702: calling self._execute() 7557 1726882115.49760: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882115.49772: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882115.49786: variable 'omit' from source: magic vars 7557 1726882115.50174: variable 'ansible_distribution_major_version' from source: facts 7557 1726882115.50192: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882115.50374: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 7557 1726882115.50660: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 7557 1726882115.50715: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 7557 1726882115.50751: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 7557 1726882115.50789: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 7557 1726882115.50880: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 7557 1726882115.50917: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 7557 1726882115.50949: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 7557 1726882115.50983: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 7557 1726882115.51134: variable '__network_is_ostree' from source: set_fact 7557 1726882115.51137: Evaluated conditional (not __network_is_ostree is defined): False 7557 1726882115.51139: when evaluation is False, skipping this task 7557 1726882115.51141: _execute() done 7557 1726882115.51143: dumping result to json 7557 1726882115.51146: done dumping result, returning 7557 1726882115.51149: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [12673a56-9f93-ed48-b3a5-0000000019b9] 7557 1726882115.51151: sending task result for task 12673a56-9f93-ed48-b3a5-0000000019b9 skipping: [managed_node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 7557 1726882115.51445: no more pending results, returning what we have 7557 1726882115.51449: results queue empty 7557 1726882115.51451: checking for any_errors_fatal 7557 1726882115.51458: done checking for any_errors_fatal 7557 1726882115.51458: checking for max_fail_percentage 7557 1726882115.51460: done checking for max_fail_percentage 7557 1726882115.51461: checking to see if all hosts have failed and the 
running result is not ok 7557 1726882115.51462: done checking to see if all hosts have failed 7557 1726882115.51463: getting the remaining hosts for this loop 7557 1726882115.51465: done getting the remaining hosts for this loop 7557 1726882115.51468: getting the next task for host managed_node3 7557 1726882115.51479: done getting next task for host managed_node3 7557 1726882115.51482: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 7557 1726882115.51486: ^ state is: HOST STATE: block=2, task=37, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7557 1726882115.51515: getting variables 7557 1726882115.51517: in VariableManager get_vars() 7557 1726882115.51569: Calling all_inventory to load vars for managed_node3 7557 1726882115.51572: Calling groups_inventory to load vars for managed_node3 7557 1726882115.51574: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882115.51584: Calling all_plugins_play to load vars for managed_node3 7557 1726882115.51587: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882115.51590: Calling groups_plugins_play to load vars for managed_node3 7557 1726882115.52414: done sending task result for task 12673a56-9f93-ed48-b3a5-0000000019b9 7557 1726882115.52418: WORKER PROCESS EXITING 7557 1726882115.53477: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882115.55409: done with get_vars() 7557 1726882115.55434: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Friday 20 September 2024 21:28:35 -0400 (0:00:00.068) 0:00:41.408 ****** 7557 1726882115.55554: entering _queue_task() for managed_node3/service_facts 7557 1726882115.56071: worker is 1 (out of 1 available) 7557 1726882115.56084: exiting _queue_task() for managed_node3/service_facts 7557 1726882115.56182: done queuing things up, now waiting for results queue to drain 7557 1726882115.56184: waiting for pending results... 
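[editor note] The task queued above, "Check which services are running", dispatches the service_facts module. A minimal sketch of that task and of how the gathered data is typically consumed afterwards (the debug task is a hypothetical follow-up, not part of the role; the key format matches the JSON the module returns further down):

- name: Check which services are running
  service_facts:                      # takes no arguments; populates ansible_facts.services

- name: Show NetworkManager state     # hypothetical consumer of the gathered facts
  debug:
    msg: "{{ ansible_facts.services['NetworkManager.service'].state }}"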
7557 1726882115.56762: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which services are running 7557 1726882115.57253: in run() - task 12673a56-9f93-ed48-b3a5-0000000019bb 7557 1726882115.57279: variable 'ansible_search_path' from source: unknown 7557 1726882115.57288: variable 'ansible_search_path' from source: unknown 7557 1726882115.57330: calling self._execute() 7557 1726882115.57471: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882115.57701: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882115.57706: variable 'omit' from source: magic vars 7557 1726882115.58161: variable 'ansible_distribution_major_version' from source: facts 7557 1726882115.58181: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882115.58197: variable 'omit' from source: magic vars 7557 1726882115.58291: variable 'omit' from source: magic vars 7557 1726882115.58337: variable 'omit' from source: magic vars 7557 1726882115.58390: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7557 1726882115.58438: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7557 1726882115.58474: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7557 1726882115.58500: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7557 1726882115.58520: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7557 1726882115.58557: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7557 1726882115.58571: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882115.58581: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882115.58701: Set connection var ansible_module_compression to ZIP_DEFLATED 7557 1726882115.58717: Set connection var ansible_shell_executable to /bin/sh 7557 1726882115.58726: Set connection var ansible_shell_type to sh 7557 1726882115.58738: Set connection var ansible_pipelining to False 7557 1726882115.58745: Set connection var ansible_connection to ssh 7557 1726882115.58757: Set connection var ansible_timeout to 10 7557 1726882115.58790: variable 'ansible_shell_executable' from source: unknown 7557 1726882115.58804: variable 'ansible_connection' from source: unknown 7557 1726882115.58900: variable 'ansible_module_compression' from source: unknown 7557 1726882115.58903: variable 'ansible_shell_type' from source: unknown 7557 1726882115.58909: variable 'ansible_shell_executable' from source: unknown 7557 1726882115.58911: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882115.58913: variable 'ansible_pipelining' from source: unknown 7557 1726882115.58915: variable 'ansible_timeout' from source: unknown 7557 1726882115.58917: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882115.59069: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 7557 1726882115.59080: variable 'omit' from source: magic vars 7557 1726882115.59083: starting attempt loop 
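[editor note] Before running the handler, the TaskExecutor above resolves the connection variables it reports (ansible_connection=ssh, ansible_shell_type=sh, ansible_shell_executable=/bin/sh, ansible_pipelining=False, ansible_timeout=10, ansible_module_compression=ZIP_DEFLATED). An illustrative host_vars entry, not taken from the test inventory, showing how those same settings could be pinned explicitly for managed_node3:

# host_vars/managed_node3.yml (illustrative only; values mirror what the log shows being applied)
ansible_connection: ssh
ansible_shell_type: sh
ansible_shell_executable: /bin/sh
ansible_pipelining: false
ansible_timeout: 10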
7557 1726882115.59086: running the handler 7557 1726882115.59108: _low_level_execute_command(): starting 7557 1726882115.59116: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7557 1726882115.59961: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882115.59964: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882115.59966: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882115.59968: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882115.61588: stdout chunk (state=3): >>>/root <<< 7557 1726882115.61739: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882115.61742: stdout chunk (state=3): >>><<< 7557 1726882115.61745: stderr chunk (state=3): >>><<< 7557 1726882115.61856: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7557 1726882115.61861: _low_level_execute_command(): starting 7557 1726882115.61863: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882115.617662-9191-15724618725699 `" && echo ansible-tmp-1726882115.617662-9191-15724618725699="` echo /root/.ansible/tmp/ansible-tmp-1726882115.617662-9191-15724618725699 `" ) && sleep 0' 7557 1726882115.62380: stderr chunk (state=2): 
>>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7557 1726882115.62395: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7557 1726882115.62414: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882115.62438: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7557 1726882115.62453: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 <<< 7557 1726882115.62540: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882115.62571: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882115.62595: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882115.62610: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882115.62686: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882115.64762: stdout chunk (state=3): >>>ansible-tmp-1726882115.617662-9191-15724618725699=/root/.ansible/tmp/ansible-tmp-1726882115.617662-9191-15724618725699 <<< 7557 1726882115.64766: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882115.64768: stdout chunk (state=3): >>><<< 7557 1726882115.64770: stderr chunk (state=3): >>><<< 7557 1726882115.64773: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882115.617662-9191-15724618725699=/root/.ansible/tmp/ansible-tmp-1726882115.617662-9191-15724618725699 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7557 1726882115.64899: variable 'ansible_module_compression' from source: unknown 7557 1726882115.65112: ANSIBALLZ: using cached module: 
/root/.ansible/tmp/ansible-local-7557ap94rh2e/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 7557 1726882115.65210: variable 'ansible_facts' from source: unknown 7557 1726882115.65251: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882115.617662-9191-15724618725699/AnsiballZ_service_facts.py 7557 1726882115.65649: Sending initial data 7557 1726882115.65676: Sent initial data (158 bytes) 7557 1726882115.66311: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7557 1726882115.66325: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7557 1726882115.66422: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882115.66454: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882115.66489: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882115.66574: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882115.68345: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 7557 1726882115.68370: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 7557 1726882115.68441: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-7557ap94rh2e/tmpucm80l86 /root/.ansible/tmp/ansible-tmp-1726882115.617662-9191-15724618725699/AnsiballZ_service_facts.py <<< 7557 1726882115.68451: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882115.617662-9191-15724618725699/AnsiballZ_service_facts.py" <<< 7557 1726882115.68517: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-7557ap94rh2e/tmpucm80l86" to remote "/root/.ansible/tmp/ansible-tmp-1726882115.617662-9191-15724618725699/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882115.617662-9191-15724618725699/AnsiballZ_service_facts.py" <<< 7557 1726882115.69216: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882115.69284: stderr chunk (state=3): >>><<< 7557 1726882115.69300: stdout chunk (state=3): >>><<< 7557 1726882115.69322: done transferring module to remote 7557 1726882115.69334: _low_level_execute_command(): starting 7557 1726882115.69342: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882115.617662-9191-15724618725699/ /root/.ansible/tmp/ansible-tmp-1726882115.617662-9191-15724618725699/AnsiballZ_service_facts.py && sleep 0' 7557 1726882115.69925: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7557 1726882115.69945: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7557 1726882115.70007: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882115.70063: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882115.70088: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882115.70168: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882115.71884: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882115.71898: stdout chunk (state=3): >>><<< 7557 1726882115.71907: stderr chunk (state=3): >>><<< 7557 1726882115.71919: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7557 1726882115.71922: _low_level_execute_command(): starting 7557 1726882115.71927: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882115.617662-9191-15724618725699/AnsiballZ_service_facts.py && sleep 0' 7557 1726882115.72334: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7557 1726882115.72337: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882115.72339: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 7557 1726882115.72341: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882115.72399: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882115.72402: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882115.72447: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882117.19003: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": 
"stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", 
"state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source":<<< 7557 1726882117.19033: stdout chunk (state=3): >>> "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": 
"alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_sta<<< 7557 1726882117.19046: stdout chunk (state=3): >>>t.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": 
"quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": 
"inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 7557 1726882117.20536: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. <<< 7557 1726882117.20564: stderr chunk (state=3): >>><<< 7557 1726882117.20567: stdout chunk (state=3): >>><<< 7557 1726882117.20596: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": 
"systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, 
"systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": 
"user@0.service", "state": "running", "status": "active", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": 
"systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. 
7557 1726882117.21044: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882115.617662-9191-15724618725699/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7557 1726882117.21051: _low_level_execute_command(): starting 7557 1726882117.21058: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882115.617662-9191-15724618725699/ > /dev/null 2>&1 && sleep 0' 7557 1726882117.21534: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882117.21537: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882117.21540: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 7557 1726882117.21542: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found <<< 7557 1726882117.21544: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882117.21602: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882117.21608: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882117.21610: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882117.21653: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882117.23379: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882117.23412: stderr chunk (state=3): >>><<< 7557 1726882117.23416: stdout chunk (state=3): >>><<< 7557 1726882117.23431: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7557 1726882117.23437: handler run complete 7557 1726882117.23557: variable 'ansible_facts' from source: unknown 7557 1726882117.23653: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882117.23927: variable 'ansible_facts' from source: unknown 7557 1726882117.24009: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882117.24124: attempt loop complete, returning result 7557 1726882117.24127: _execute() done 7557 1726882117.24130: dumping result to json 7557 1726882117.24164: done dumping result, returning 7557 1726882117.24172: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which services are running [12673a56-9f93-ed48-b3a5-0000000019bb] 7557 1726882117.24182: sending task result for task 12673a56-9f93-ed48-b3a5-0000000019bb ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 7557 1726882117.24789: no more pending results, returning what we have 7557 1726882117.24791: results queue empty 7557 1726882117.24796: checking for any_errors_fatal 7557 1726882117.24801: done checking for any_errors_fatal 7557 1726882117.24802: checking for max_fail_percentage 7557 1726882117.24808: done checking for max_fail_percentage 7557 1726882117.24808: checking to see if all hosts have failed and the running result is not ok 7557 1726882117.24809: done checking to see if all hosts have failed 7557 1726882117.24810: getting the remaining hosts for this loop 7557 1726882117.24811: done getting the remaining hosts for this loop 7557 1726882117.24814: getting the next task for host managed_node3 7557 1726882117.24821: done getting next task for host managed_node3 7557 1726882117.24824: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 7557 1726882117.24827: ^ state is: HOST STATE: block=2, task=37, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7557 1726882117.24835: done sending task result for task 12673a56-9f93-ed48-b3a5-0000000019bb 7557 1726882117.24838: WORKER PROCESS EXITING 7557 1726882117.24845: getting variables 7557 1726882117.24846: in VariableManager get_vars() 7557 1726882117.24877: Calling all_inventory to load vars for managed_node3 7557 1726882117.24878: Calling groups_inventory to load vars for managed_node3 7557 1726882117.24880: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882117.24886: Calling all_plugins_play to load vars for managed_node3 7557 1726882117.24888: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882117.24890: Calling groups_plugins_play to load vars for managed_node3 7557 1726882117.25714: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882117.26580: done with get_vars() 7557 1726882117.26601: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 21:28:37 -0400 (0:00:01.711) 0:00:43.119 ****** 7557 1726882117.26676: entering _queue_task() for managed_node3/package_facts 7557 1726882117.26927: worker is 1 (out of 1 available) 7557 1726882117.26939: exiting _queue_task() for managed_node3/package_facts 7557 1726882117.26952: done queuing things up, now waiting for results queue to drain 7557 1726882117.26954: waiting for pending results... 7557 1726882117.27139: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which packages are installed 7557 1726882117.27242: in run() - task 12673a56-9f93-ed48-b3a5-0000000019bc 7557 1726882117.27255: variable 'ansible_search_path' from source: unknown 7557 1726882117.27258: variable 'ansible_search_path' from source: unknown 7557 1726882117.27286: calling self._execute() 7557 1726882117.27365: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882117.27369: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882117.27377: variable 'omit' from source: magic vars 7557 1726882117.27659: variable 'ansible_distribution_major_version' from source: facts 7557 1726882117.27669: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882117.27675: variable 'omit' from source: magic vars 7557 1726882117.27725: variable 'omit' from source: magic vars 7557 1726882117.27754: variable 'omit' from source: magic vars 7557 1726882117.27785: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7557 1726882117.27815: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7557 1726882117.27831: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7557 1726882117.27849: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7557 1726882117.27858: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7557 1726882117.27882: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7557 1726882117.27885: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882117.27888: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node3' 7557 1726882117.27962: Set connection var ansible_module_compression to ZIP_DEFLATED 7557 1726882117.27968: Set connection var ansible_shell_executable to /bin/sh 7557 1726882117.27971: Set connection var ansible_shell_type to sh 7557 1726882117.27976: Set connection var ansible_pipelining to False 7557 1726882117.27978: Set connection var ansible_connection to ssh 7557 1726882117.27983: Set connection var ansible_timeout to 10 7557 1726882117.28004: variable 'ansible_shell_executable' from source: unknown 7557 1726882117.28007: variable 'ansible_connection' from source: unknown 7557 1726882117.28010: variable 'ansible_module_compression' from source: unknown 7557 1726882117.28012: variable 'ansible_shell_type' from source: unknown 7557 1726882117.28014: variable 'ansible_shell_executable' from source: unknown 7557 1726882117.28016: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882117.28021: variable 'ansible_pipelining' from source: unknown 7557 1726882117.28023: variable 'ansible_timeout' from source: unknown 7557 1726882117.28028: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882117.28173: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 7557 1726882117.28182: variable 'omit' from source: magic vars 7557 1726882117.28186: starting attempt loop 7557 1726882117.28188: running the handler 7557 1726882117.28203: _low_level_execute_command(): starting 7557 1726882117.28210: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7557 1726882117.28733: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882117.28736: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882117.28739: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882117.28741: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882117.28789: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882117.28792: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882117.28798: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882117.28854: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882117.30386: stdout chunk (state=3): >>>/root <<< 7557 1726882117.30485: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882117.30519: stderr chunk (state=3): >>><<< 7557 
1726882117.30522: stdout chunk (state=3): >>><<< 7557 1726882117.30543: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7557 1726882117.30557: _low_level_execute_command(): starting 7557 1726882117.30563: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882117.3054273-9234-238742247269755 `" && echo ansible-tmp-1726882117.3054273-9234-238742247269755="` echo /root/.ansible/tmp/ansible-tmp-1726882117.3054273-9234-238742247269755 `" ) && sleep 0' 7557 1726882117.30996: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882117.31026: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882117.31037: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882117.31040: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882117.31088: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882117.31098: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882117.31101: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882117.31140: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882117.32949: stdout chunk (state=3): >>>ansible-tmp-1726882117.3054273-9234-238742247269755=/root/.ansible/tmp/ansible-tmp-1726882117.3054273-9234-238742247269755 <<< 7557 1726882117.33055: stderr chunk (state=3): >>>debug2: Received exit 
status from master 0 <<< 7557 1726882117.33080: stderr chunk (state=3): >>><<< 7557 1726882117.33083: stdout chunk (state=3): >>><<< 7557 1726882117.33102: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882117.3054273-9234-238742247269755=/root/.ansible/tmp/ansible-tmp-1726882117.3054273-9234-238742247269755 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7557 1726882117.33142: variable 'ansible_module_compression' from source: unknown 7557 1726882117.33184: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-7557ap94rh2e/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 7557 1726882117.33240: variable 'ansible_facts' from source: unknown 7557 1726882117.33359: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882117.3054273-9234-238742247269755/AnsiballZ_package_facts.py 7557 1726882117.33470: Sending initial data 7557 1726882117.33473: Sent initial data (160 bytes) 7557 1726882117.33940: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7557 1726882117.33944: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7557 1726882117.33946: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882117.33948: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882117.33950: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882117.33999: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882117.34002: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882117.34005: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master 
version 4 <<< 7557 1726882117.34058: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882117.35550: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 7557 1726882117.35553: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 7557 1726882117.35588: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 7557 1726882117.35639: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-7557ap94rh2e/tmpk211mtyr /root/.ansible/tmp/ansible-tmp-1726882117.3054273-9234-238742247269755/AnsiballZ_package_facts.py <<< 7557 1726882117.35642: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882117.3054273-9234-238742247269755/AnsiballZ_package_facts.py" <<< 7557 1726882117.35682: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-7557ap94rh2e/tmpk211mtyr" to remote "/root/.ansible/tmp/ansible-tmp-1726882117.3054273-9234-238742247269755/AnsiballZ_package_facts.py" <<< 7557 1726882117.35688: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882117.3054273-9234-238742247269755/AnsiballZ_package_facts.py" <<< 7557 1726882117.36721: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882117.36764: stderr chunk (state=3): >>><<< 7557 1726882117.36767: stdout chunk (state=3): >>><<< 7557 1726882117.36794: done transferring module to remote 7557 1726882117.36806: _low_level_execute_command(): starting 7557 1726882117.36811: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882117.3054273-9234-238742247269755/ /root/.ansible/tmp/ansible-tmp-1726882117.3054273-9234-238742247269755/AnsiballZ_package_facts.py && sleep 0' 7557 1726882117.37256: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7557 1726882117.37259: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found <<< 7557 1726882117.37262: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882117.37264: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882117.37270: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882117.37329: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882117.37331: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882117.37333: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882117.37374: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882117.39067: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882117.39095: stderr chunk (state=3): >>><<< 7557 1726882117.39099: stdout chunk (state=3): >>><<< 7557 1726882117.39115: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7557 1726882117.39118: _low_level_execute_command(): starting 7557 1726882117.39120: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882117.3054273-9234-238742247269755/AnsiballZ_package_facts.py && sleep 0' 7557 1726882117.39570: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882117.39573: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found <<< 7557 1726882117.39576: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 7557 1726882117.39578: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882117.39631: stderr chunk (state=3): 
>>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882117.39634: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882117.39638: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882117.39688: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882117.83284: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": 
"20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "rele<<< 7557 1726882117.83309: stdout chunk (state=3): >>>ase": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": 
"9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "rele<<< 7557 1726882117.83345: stdout chunk (state=3): >>>ase": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", 
"version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10",<<< 7557 1726882117.83368: stdout chunk (state=3): >>> "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", 
"version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "sou<<< 7557 1726882117.83431: stdout chunk (state=3): >>>rce": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", 
"version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.7<<< 7557 1726882117.83439: stdout chunk (state=3): >>>3.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": 
[{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"<<< 7557 1726882117.83441: stdout chunk (state=3): >>>}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", 
"release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-resc<<< 7557 1726882117.83448: stdout chunk (state=3): >>>ue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": 
"openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "r<<< 7557 1726882117.83453: stdout chunk (state=3): >>>pm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1<<< 7557 1726882117.83458: stdout chunk (state=3): >>>.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10<<< 7557 1726882117.83470: stdout chunk (state=3): >>>", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.<<< 7557 1726882117.83487: stdout chunk (state=3): >>>26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "c<<< 7557 1726882117.83502: stdout chunk (state=3): >>>loud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 7557 1726882117.85233: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. 
<<< 7557 1726882117.85266: stderr chunk (state=3): >>><<< 7557 1726882117.85269: stdout chunk (state=3): >>><<< 7557 1726882117.85308: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": 
"amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": 
"17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], 
"libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": 
"1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": 
"1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": 
[{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": 
"510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], 
"perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], 
"perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. 
7557 1726882117.86597: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882117.3054273-9234-238742247269755/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7557 1726882117.86615: _low_level_execute_command(): starting 7557 1726882117.86619: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882117.3054273-9234-238742247269755/ > /dev/null 2>&1 && sleep 0' 7557 1726882117.87077: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7557 1726882117.87081: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7557 1726882117.87084: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882117.87086: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882117.87088: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882117.87142: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882117.87149: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882117.87151: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882117.87201: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882117.88974: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882117.89001: stderr chunk (state=3): >>><<< 7557 1726882117.89004: stdout chunk (state=3): >>><<< 7557 1726882117.89015: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7557 1726882117.89020: handler run complete 7557 1726882117.89461: variable 'ansible_facts' from source: unknown 7557 1726882117.89721: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882117.90744: variable 'ansible_facts' from source: unknown 7557 1726882117.90979: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882117.91359: attempt loop complete, returning result 7557 1726882117.91368: _execute() done 7557 1726882117.91372: dumping result to json 7557 1726882117.91574: done dumping result, returning 7557 1726882117.91578: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which packages are installed [12673a56-9f93-ed48-b3a5-0000000019bc] 7557 1726882117.91581: sending task result for task 12673a56-9f93-ed48-b3a5-0000000019bc 7557 1726882117.92811: done sending task result for task 12673a56-9f93-ed48-b3a5-0000000019bc 7557 1726882117.92814: WORKER PROCESS EXITING ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 7557 1726882117.92907: no more pending results, returning what we have 7557 1726882117.92910: results queue empty 7557 1726882117.92910: checking for any_errors_fatal 7557 1726882117.92915: done checking for any_errors_fatal 7557 1726882117.92915: checking for max_fail_percentage 7557 1726882117.92917: done checking for max_fail_percentage 7557 1726882117.92917: checking to see if all hosts have failed and the running result is not ok 7557 1726882117.92918: done checking to see if all hosts have failed 7557 1726882117.92918: getting the remaining hosts for this loop 7557 1726882117.92919: done getting the remaining hosts for this loop 7557 1726882117.92921: getting the next task for host managed_node3 7557 1726882117.92927: done getting next task for host managed_node3 7557 1726882117.92929: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 7557 1726882117.92931: ^ state is: HOST STATE: block=2, task=37, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7557 1726882117.92939: getting variables 7557 1726882117.92939: in VariableManager get_vars() 7557 1726882117.92970: Calling all_inventory to load vars for managed_node3 7557 1726882117.92972: Calling groups_inventory to load vars for managed_node3 7557 1726882117.92973: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882117.92980: Calling all_plugins_play to load vars for managed_node3 7557 1726882117.92981: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882117.92983: Calling groups_plugins_play to load vars for managed_node3 7557 1726882117.93668: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882117.94605: done with get_vars() 7557 1726882117.94621: done getting variables 7557 1726882117.94664: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 21:28:37 -0400 (0:00:00.680) 0:00:43.799 ****** 7557 1726882117.94699: entering _queue_task() for managed_node3/debug 7557 1726882117.94942: worker is 1 (out of 1 available) 7557 1726882117.94955: exiting _queue_task() for managed_node3/debug 7557 1726882117.94969: done queuing things up, now waiting for results queue to drain 7557 1726882117.94970: waiting for pending results... 7557 1726882117.95152: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Print network provider 7557 1726882117.95250: in run() - task 12673a56-9f93-ed48-b3a5-00000000010c 7557 1726882117.95262: variable 'ansible_search_path' from source: unknown 7557 1726882117.95265: variable 'ansible_search_path' from source: unknown 7557 1726882117.95300: calling self._execute() 7557 1726882117.95375: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882117.95379: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882117.95388: variable 'omit' from source: magic vars 7557 1726882117.95666: variable 'ansible_distribution_major_version' from source: facts 7557 1726882117.95676: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882117.95681: variable 'omit' from source: magic vars 7557 1726882117.95721: variable 'omit' from source: magic vars 7557 1726882117.95790: variable 'network_provider' from source: set_fact 7557 1726882117.95808: variable 'omit' from source: magic vars 7557 1726882117.95841: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7557 1726882117.95871: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7557 1726882117.95886: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7557 1726882117.95902: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7557 1726882117.95912: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7557 1726882117.95935: variable 
'inventory_hostname' from source: host vars for 'managed_node3' 7557 1726882117.95938: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882117.95941: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882117.96014: Set connection var ansible_module_compression to ZIP_DEFLATED 7557 1726882117.96021: Set connection var ansible_shell_executable to /bin/sh 7557 1726882117.96024: Set connection var ansible_shell_type to sh 7557 1726882117.96029: Set connection var ansible_pipelining to False 7557 1726882117.96031: Set connection var ansible_connection to ssh 7557 1726882117.96036: Set connection var ansible_timeout to 10 7557 1726882117.96053: variable 'ansible_shell_executable' from source: unknown 7557 1726882117.96056: variable 'ansible_connection' from source: unknown 7557 1726882117.96059: variable 'ansible_module_compression' from source: unknown 7557 1726882117.96061: variable 'ansible_shell_type' from source: unknown 7557 1726882117.96065: variable 'ansible_shell_executable' from source: unknown 7557 1726882117.96067: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882117.96070: variable 'ansible_pipelining' from source: unknown 7557 1726882117.96072: variable 'ansible_timeout' from source: unknown 7557 1726882117.96074: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882117.96175: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 7557 1726882117.96185: variable 'omit' from source: magic vars 7557 1726882117.96188: starting attempt loop 7557 1726882117.96191: running the handler 7557 1726882117.96231: handler run complete 7557 1726882117.96241: attempt loop complete, returning result 7557 1726882117.96244: _execute() done 7557 1726882117.96247: dumping result to json 7557 1726882117.96249: done dumping result, returning 7557 1726882117.96256: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Print network provider [12673a56-9f93-ed48-b3a5-00000000010c] 7557 1726882117.96260: sending task result for task 12673a56-9f93-ed48-b3a5-00000000010c 7557 1726882117.96341: done sending task result for task 12673a56-9f93-ed48-b3a5-00000000010c 7557 1726882117.96344: WORKER PROCESS EXITING ok: [managed_node3] => {} MSG: Using network provider: nm 7557 1726882117.96421: no more pending results, returning what we have 7557 1726882117.96425: results queue empty 7557 1726882117.96426: checking for any_errors_fatal 7557 1726882117.96437: done checking for any_errors_fatal 7557 1726882117.96437: checking for max_fail_percentage 7557 1726882117.96439: done checking for max_fail_percentage 7557 1726882117.96440: checking to see if all hosts have failed and the running result is not ok 7557 1726882117.96441: done checking to see if all hosts have failed 7557 1726882117.96441: getting the remaining hosts for this loop 7557 1726882117.96443: done getting the remaining hosts for this loop 7557 1726882117.96446: getting the next task for host managed_node3 7557 1726882117.96453: done getting next task for host managed_node3 7557 1726882117.96457: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts 
provider 7557 1726882117.96460: ^ state is: HOST STATE: block=2, task=37, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7557 1726882117.96478: getting variables 7557 1726882117.96480: in VariableManager get_vars() 7557 1726882117.96537: Calling all_inventory to load vars for managed_node3 7557 1726882117.96540: Calling groups_inventory to load vars for managed_node3 7557 1726882117.96545: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882117.96555: Calling all_plugins_play to load vars for managed_node3 7557 1726882117.96558: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882117.96560: Calling groups_plugins_play to load vars for managed_node3 7557 1726882117.97370: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882117.98379: done with get_vars() 7557 1726882117.98407: done getting variables 7557 1726882117.98466: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 21:28:37 -0400 (0:00:00.038) 0:00:43.837 ****** 7557 1726882117.98504: entering _queue_task() for managed_node3/fail 7557 1726882117.98823: worker is 1 (out of 1 available) 7557 1726882117.98835: exiting _queue_task() for managed_node3/fail 7557 1726882117.98848: done queuing things up, now waiting for results queue to drain 7557 1726882117.98849: waiting for pending results... 
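The "Print network provider" result a little further up ("Using network provider: nm") comes from a plain debug task at roles/network/tasks/main.yml:7 that reads the network_provider variable set earlier in the role via set_fact. A reconstruction consistent with that output is sketched below; it is not a verbatim copy of the role's task.

    - name: Print network provider  # reconstruction consistent with the log, not verbatim
      ansible.builtin.debug:
        msg: "Using network provider: {{ network_provider }}"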
7557 1726882117.99224: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 7557 1726882117.99290: in run() - task 12673a56-9f93-ed48-b3a5-00000000010d 7557 1726882117.99500: variable 'ansible_search_path' from source: unknown 7557 1726882117.99503: variable 'ansible_search_path' from source: unknown 7557 1726882117.99506: calling self._execute() 7557 1726882117.99509: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882117.99511: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882117.99514: variable 'omit' from source: magic vars 7557 1726882117.99868: variable 'ansible_distribution_major_version' from source: facts 7557 1726882117.99886: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882118.00017: variable 'network_state' from source: role '' defaults 7557 1726882118.00036: Evaluated conditional (network_state != {}): False 7557 1726882118.00045: when evaluation is False, skipping this task 7557 1726882118.00054: _execute() done 7557 1726882118.00066: dumping result to json 7557 1726882118.00081: done dumping result, returning 7557 1726882118.00088: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [12673a56-9f93-ed48-b3a5-00000000010d] 7557 1726882118.00099: sending task result for task 12673a56-9f93-ed48-b3a5-00000000010d skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 7557 1726882118.00247: no more pending results, returning what we have 7557 1726882118.00251: results queue empty 7557 1726882118.00253: checking for any_errors_fatal 7557 1726882118.00263: done checking for any_errors_fatal 7557 1726882118.00263: checking for max_fail_percentage 7557 1726882118.00265: done checking for max_fail_percentage 7557 1726882118.00266: checking to see if all hosts have failed and the running result is not ok 7557 1726882118.00267: done checking to see if all hosts have failed 7557 1726882118.00267: getting the remaining hosts for this loop 7557 1726882118.00269: done getting the remaining hosts for this loop 7557 1726882118.00272: getting the next task for host managed_node3 7557 1726882118.00278: done getting next task for host managed_node3 7557 1726882118.00282: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 7557 1726882118.00285: ^ state is: HOST STATE: block=2, task=37, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7557 1726882118.00313: getting variables 7557 1726882118.00315: in VariableManager get_vars() 7557 1726882118.00360: Calling all_inventory to load vars for managed_node3 7557 1726882118.00363: Calling groups_inventory to load vars for managed_node3 7557 1726882118.00365: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882118.00375: Calling all_plugins_play to load vars for managed_node3 7557 1726882118.00377: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882118.00379: Calling groups_plugins_play to load vars for managed_node3 7557 1726882118.00908: done sending task result for task 12673a56-9f93-ed48-b3a5-00000000010d 7557 1726882118.00911: WORKER PROCESS EXITING 7557 1726882118.01289: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882118.07743: done with get_vars() 7557 1726882118.07768: done getting variables 7557 1726882118.07824: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 21:28:38 -0400 (0:00:00.093) 0:00:43.931 ****** 7557 1726882118.07852: entering _queue_task() for managed_node3/fail 7557 1726882118.08200: worker is 1 (out of 1 available) 7557 1726882118.08213: exiting _queue_task() for managed_node3/fail 7557 1726882118.08224: done queuing things up, now waiting for results queue to drain 7557 1726882118.08226: waiting for pending results... 
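The skipping result above (and the matching one for the "system version ... below 8" task that follows) is produced by a fail task gated on network_state != {}: because network_state is still at the role default of {}, the when clause evaluates False, and Ansible records the unmet condition as false_condition with skip_reason "Conditional result was False". The pattern, in a self-contained form with placeholder names (not the role's verbatim task), is:

    - name: Abort when network_state is combined with an unsupported provider  # placeholder name
      ansible.builtin.fail:
        msg: "Applying the network state configuration is not supported here"
      when: network_state != {}   # False in this run, so the task is skipped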
7557 1726882118.08617: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 7557 1726882118.08710: in run() - task 12673a56-9f93-ed48-b3a5-00000000010e 7557 1726882118.08733: variable 'ansible_search_path' from source: unknown 7557 1726882118.08741: variable 'ansible_search_path' from source: unknown 7557 1726882118.08783: calling self._execute() 7557 1726882118.08897: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882118.08936: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882118.08939: variable 'omit' from source: magic vars 7557 1726882118.09349: variable 'ansible_distribution_major_version' from source: facts 7557 1726882118.09482: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882118.09507: variable 'network_state' from source: role '' defaults 7557 1726882118.09525: Evaluated conditional (network_state != {}): False 7557 1726882118.09534: when evaluation is False, skipping this task 7557 1726882118.09542: _execute() done 7557 1726882118.09549: dumping result to json 7557 1726882118.09557: done dumping result, returning 7557 1726882118.09568: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [12673a56-9f93-ed48-b3a5-00000000010e] 7557 1726882118.09579: sending task result for task 12673a56-9f93-ed48-b3a5-00000000010e skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 7557 1726882118.09743: no more pending results, returning what we have 7557 1726882118.09748: results queue empty 7557 1726882118.09749: checking for any_errors_fatal 7557 1726882118.09757: done checking for any_errors_fatal 7557 1726882118.09758: checking for max_fail_percentage 7557 1726882118.09760: done checking for max_fail_percentage 7557 1726882118.09761: checking to see if all hosts have failed and the running result is not ok 7557 1726882118.09762: done checking to see if all hosts have failed 7557 1726882118.09763: getting the remaining hosts for this loop 7557 1726882118.09764: done getting the remaining hosts for this loop 7557 1726882118.09768: getting the next task for host managed_node3 7557 1726882118.09775: done getting next task for host managed_node3 7557 1726882118.09780: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 7557 1726882118.09784: ^ state is: HOST STATE: block=2, task=37, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7557 1726882118.09815: getting variables 7557 1726882118.09817: in VariableManager get_vars() 7557 1726882118.09869: Calling all_inventory to load vars for managed_node3 7557 1726882118.09872: Calling groups_inventory to load vars for managed_node3 7557 1726882118.09875: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882118.09887: Calling all_plugins_play to load vars for managed_node3 7557 1726882118.09890: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882118.10001: done sending task result for task 12673a56-9f93-ed48-b3a5-00000000010e 7557 1726882118.10004: WORKER PROCESS EXITING 7557 1726882118.10009: Calling groups_plugins_play to load vars for managed_node3 7557 1726882118.11519: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882118.13091: done with get_vars() 7557 1726882118.13122: done getting variables 7557 1726882118.13182: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 21:28:38 -0400 (0:00:00.053) 0:00:43.985 ****** 7557 1726882118.13223: entering _queue_task() for managed_node3/fail 7557 1726882118.13551: worker is 1 (out of 1 available) 7557 1726882118.13564: exiting _queue_task() for managed_node3/fail 7557 1726882118.13577: done queuing things up, now waiting for results queue to drain 7557 1726882118.13579: waiting for pending results... 
7557 1726882118.13879: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 7557 1726882118.14034: in run() - task 12673a56-9f93-ed48-b3a5-00000000010f 7557 1726882118.14055: variable 'ansible_search_path' from source: unknown 7557 1726882118.14064: variable 'ansible_search_path' from source: unknown 7557 1726882118.14110: calling self._execute() 7557 1726882118.14224: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882118.14242: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882118.14258: variable 'omit' from source: magic vars 7557 1726882118.14662: variable 'ansible_distribution_major_version' from source: facts 7557 1726882118.14685: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882118.14920: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 7557 1726882118.17206: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 7557 1726882118.17515: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 7557 1726882118.17543: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 7557 1726882118.17588: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 7557 1726882118.17606: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 7557 1726882118.17664: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7557 1726882118.17800: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7557 1726882118.17804: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7557 1726882118.17807: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7557 1726882118.17809: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7557 1726882118.17883: variable 'ansible_distribution_major_version' from source: facts 7557 1726882118.17909: Evaluated conditional (ansible_distribution_major_version | int > 9): True 7557 1726882118.18039: variable 'ansible_distribution' from source: facts 7557 1726882118.18053: variable '__network_rh_distros' from source: role '' defaults 7557 1726882118.18069: Evaluated conditional (ansible_distribution in __network_rh_distros): True 7557 1726882118.18328: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7557 1726882118.18356: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7557 1726882118.18383: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7557 1726882118.18431: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7557 1726882118.18450: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7557 1726882118.18502: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7557 1726882118.18530: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7557 1726882118.18558: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7557 1726882118.18604: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7557 1726882118.18624: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7557 1726882118.18667: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7557 1726882118.18800: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7557 1726882118.18803: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7557 1726882118.18806: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7557 1726882118.18811: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7557 1726882118.19066: variable 'network_connections' from source: task vars 7557 1726882118.19082: variable 'interface' from source: play vars 7557 1726882118.19154: variable 'interface' from source: play vars 7557 1726882118.19169: variable 'network_state' from source: role '' defaults 7557 1726882118.19248: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 7557 1726882118.19420: Loading TestModule 
'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 7557 1726882118.19466: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 7557 1726882118.19504: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 7557 1726882118.19553: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 7557 1726882118.19607: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 7557 1726882118.19631: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 7557 1726882118.19668: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 7557 1726882118.19704: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 7557 1726882118.19731: Evaluated conditional (network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0 or network_state.get("interfaces", []) | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0): False 7557 1726882118.19744: when evaluation is False, skipping this task 7557 1726882118.19751: _execute() done 7557 1726882118.19790: dumping result to json 7557 1726882118.19797: done dumping result, returning 7557 1726882118.19800: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [12673a56-9f93-ed48-b3a5-00000000010f] 7557 1726882118.19802: sending task result for task 12673a56-9f93-ed48-b3a5-00000000010f skipping: [managed_node3] => { "changed": false, "false_condition": "network_connections | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0 or network_state.get(\"interfaces\", []) | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0", "skip_reason": "Conditional result was False" } 7557 1726882118.19953: no more pending results, returning what we have 7557 1726882118.19957: results queue empty 7557 1726882118.19959: checking for any_errors_fatal 7557 1726882118.19963: done checking for any_errors_fatal 7557 1726882118.19964: checking for max_fail_percentage 7557 1726882118.19966: done checking for max_fail_percentage 7557 1726882118.19967: checking to see if all hosts have failed and the running result is not ok 7557 1726882118.19968: done checking to see if all hosts have failed 7557 1726882118.19968: getting the remaining hosts for this loop 7557 1726882118.19970: done getting the remaining hosts for this loop 7557 1726882118.19974: getting the next task for host managed_node3 7557 1726882118.19982: done getting next task for host managed_node3 7557 1726882118.19986: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 7557 1726882118.19989: ^ state is: 
HOST STATE: block=2, task=37, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7557 1726882118.20015: getting variables 7557 1726882118.20017: in VariableManager get_vars() 7557 1726882118.20071: Calling all_inventory to load vars for managed_node3 7557 1726882118.20074: Calling groups_inventory to load vars for managed_node3 7557 1726882118.20077: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882118.20087: Calling all_plugins_play to load vars for managed_node3 7557 1726882118.20090: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882118.20299: Calling groups_plugins_play to load vars for managed_node3 7557 1726882118.21008: done sending task result for task 12673a56-9f93-ed48-b3a5-00000000010f 7557 1726882118.21012: WORKER PROCESS EXITING 7557 1726882118.21967: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882118.23835: done with get_vars() 7557 1726882118.23857: done getting variables 7557 1726882118.23925: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 21:28:38 -0400 (0:00:00.107) 0:00:44.092 ****** 7557 1726882118.23959: entering _queue_task() for managed_node3/dnf 7557 1726882118.24281: worker is 1 (out of 1 available) 7557 1726882118.24298: exiting _queue_task() for managed_node3/dnf 7557 1726882118.24311: done queuing things up, now waiting for results queue to drain 7557 1726882118.24313: waiting for pending results... 
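The false_condition recorded above chains two selectattr filters: it keeps only the network_connections entries that define a type, then only those whose type matches ^team$, and tests whether the resulting list is non-empty (repeating the same test for the interfaces in network_state). The stand-alone sketch below shows how that expression behaves; the sample network_connections data is invented for illustration and is not the data used in this run.

    - name: Show whether any team connections are defined  # illustrative only
      vars:
        network_connections:          # invented sample data
          - name: ethtest0
            type: ethernet
          - name: team0
            type: team
        network_state: {}
      ansible.builtin.debug:
        msg: >-
          {{ (network_connections | selectattr("type", "defined")
              | selectattr("type", "match", "^team$") | list | length > 0)
             or
             (network_state.get("interfaces", []) | selectattr("type", "defined")
              | selectattr("type", "match", "^team$") | list | length > 0) }}

With the sample data the expression is True; in the logged run none of the defined connections had type "team", so it evaluated False and the abort task was skipped.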
7557 1726882118.24737: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 7557 1726882118.24924: in run() - task 12673a56-9f93-ed48-b3a5-000000000110 7557 1726882118.24945: variable 'ansible_search_path' from source: unknown 7557 1726882118.24978: variable 'ansible_search_path' from source: unknown 7557 1726882118.25034: calling self._execute() 7557 1726882118.25185: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882118.25206: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882118.25225: variable 'omit' from source: magic vars 7557 1726882118.25664: variable 'ansible_distribution_major_version' from source: facts 7557 1726882118.25683: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882118.25910: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 7557 1726882118.30085: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 7557 1726882118.30301: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 7557 1726882118.30314: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 7557 1726882118.30533: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 7557 1726882118.30591: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 7557 1726882118.30737: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7557 1726882118.30786: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7557 1726882118.30823: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7557 1726882118.30872: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7557 1726882118.30890: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7557 1726882118.31017: variable 'ansible_distribution' from source: facts 7557 1726882118.31028: variable 'ansible_distribution_major_version' from source: facts 7557 1726882118.31054: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 7557 1726882118.31196: variable '__network_wireless_connections_defined' from source: role '' defaults 7557 1726882118.31327: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7557 1726882118.31409: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7557 1726882118.31412: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7557 1726882118.31437: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7557 1726882118.31455: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7557 1726882118.31502: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7557 1726882118.31535: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7557 1726882118.31562: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7557 1726882118.31610: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7557 1726882118.31636: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7557 1726882118.31689: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7557 1726882118.31711: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7557 1726882118.31800: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7557 1726882118.31803: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7557 1726882118.31805: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7557 1726882118.31964: variable 'network_connections' from source: task vars 7557 1726882118.31981: variable 'interface' from source: play vars 7557 1726882118.32051: variable 'interface' from source: play vars 7557 1726882118.32134: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 7557 1726882118.32325: Loading TestModule 'core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 7557 1726882118.32406: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 7557 1726882118.32455: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 7557 1726882118.32581: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 7557 1726882118.32584: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 7557 1726882118.32587: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 7557 1726882118.32599: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 7557 1726882118.32632: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 7557 1726882118.32696: variable '__network_team_connections_defined' from source: role '' defaults 7557 1726882118.32950: variable 'network_connections' from source: task vars 7557 1726882118.32960: variable 'interface' from source: play vars 7557 1726882118.33026: variable 'interface' from source: play vars 7557 1726882118.33111: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 7557 1726882118.33115: when evaluation is False, skipping this task 7557 1726882118.33117: _execute() done 7557 1726882118.33121: dumping result to json 7557 1726882118.33125: done dumping result, returning 7557 1726882118.33127: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [12673a56-9f93-ed48-b3a5-000000000110] 7557 1726882118.33129: sending task result for task 12673a56-9f93-ed48-b3a5-000000000110 skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 7557 1726882118.33280: no more pending results, returning what we have 7557 1726882118.33284: results queue empty 7557 1726882118.33285: checking for any_errors_fatal 7557 1726882118.33295: done checking for any_errors_fatal 7557 1726882118.33296: checking for max_fail_percentage 7557 1726882118.33298: done checking for max_fail_percentage 7557 1726882118.33299: checking to see if all hosts have failed and the running result is not ok 7557 1726882118.33299: done checking to see if all hosts have failed 7557 1726882118.33300: getting the remaining hosts for this loop 7557 1726882118.33302: done getting the remaining hosts for this loop 7557 1726882118.33305: getting the next task for host managed_node3 7557 1726882118.33313: done getting next task for host managed_node3 7557 1726882118.33317: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 7557 1726882118.33320: ^ state is: HOST STATE: block=2, task=37, 
rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7557 1726882118.33342: getting variables 7557 1726882118.33344: in VariableManager get_vars() 7557 1726882118.33499: Calling all_inventory to load vars for managed_node3 7557 1726882118.33503: Calling groups_inventory to load vars for managed_node3 7557 1726882118.33506: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882118.33616: Calling all_plugins_play to load vars for managed_node3 7557 1726882118.33620: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882118.33624: Calling groups_plugins_play to load vars for managed_node3 7557 1726882118.34237: done sending task result for task 12673a56-9f93-ed48-b3a5-000000000110 7557 1726882118.34240: WORKER PROCESS EXITING 7557 1726882118.35286: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882118.36942: done with get_vars() 7557 1726882118.36975: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 7557 1726882118.37058: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 21:28:38 -0400 (0:00:00.131) 0:00:44.223 ****** 7557 1726882118.37102: entering _queue_task() for managed_node3/yum 7557 1726882118.37446: worker is 1 (out of 1 available) 7557 1726882118.37459: exiting _queue_task() for managed_node3/yum 7557 1726882118.37471: done queuing things up, now waiting for results queue to drain 7557 1726882118.37472: waiting for pending results... 
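The two package-manager checks in this block follow the same pattern: the role only probes for pending network package updates when the connection profiles actually define wireless or team interfaces; otherwise the task is skipped on its when: clause, as the "Evaluated conditional ... False" lines above show. The log reports only the task names, their paths under roles/network/tasks/main.yml, and the conditions that evaluated to False, not the task bodies, so the following is a hypothetical sketch of what such a guarded check could look like. Only the condition comes from the log; the dnf arguments and the check_mode flag are assumptions.

- name: Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces
  ansible.builtin.dnf:
    # network_packages is the role variable resolved earlier in the log;
    # passing it here is an assumption about the task body.
    name: "{{ network_packages }}"
    state: latest
  check_mode: true          # probe for available updates without installing anything
  register: __network_updates_available
  when: __network_wireless_connections_defined or __network_team_connections_defined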
7557 1726882118.37780: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 7557 1726882118.37923: in run() - task 12673a56-9f93-ed48-b3a5-000000000111 7557 1726882118.37935: variable 'ansible_search_path' from source: unknown 7557 1726882118.37939: variable 'ansible_search_path' from source: unknown 7557 1726882118.37984: calling self._execute() 7557 1726882118.38101: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882118.38108: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882118.38121: variable 'omit' from source: magic vars 7557 1726882118.38532: variable 'ansible_distribution_major_version' from source: facts 7557 1726882118.38545: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882118.38732: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 7557 1726882118.41300: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 7557 1726882118.41400: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 7557 1726882118.41404: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 7557 1726882118.41438: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 7557 1726882118.41463: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 7557 1726882118.41544: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7557 1726882118.41571: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7557 1726882118.41601: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7557 1726882118.41643: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7557 1726882118.41655: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7557 1726882118.41753: variable 'ansible_distribution_major_version' from source: facts 7557 1726882118.41766: Evaluated conditional (ansible_distribution_major_version | int < 8): False 7557 1726882118.41769: when evaluation is False, skipping this task 7557 1726882118.41772: _execute() done 7557 1726882118.41775: dumping result to json 7557 1726882118.41777: done dumping result, returning 7557 1726882118.41786: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [12673a56-9f93-ed48-b3a5-000000000111] 7557 1726882118.41791: sending task result for 
task 12673a56-9f93-ed48-b3a5-000000000111 7557 1726882118.41900: done sending task result for task 12673a56-9f93-ed48-b3a5-000000000111 7557 1726882118.41903: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 7557 1726882118.41960: no more pending results, returning what we have 7557 1726882118.41964: results queue empty 7557 1726882118.41965: checking for any_errors_fatal 7557 1726882118.41973: done checking for any_errors_fatal 7557 1726882118.41973: checking for max_fail_percentage 7557 1726882118.41976: done checking for max_fail_percentage 7557 1726882118.41977: checking to see if all hosts have failed and the running result is not ok 7557 1726882118.41978: done checking to see if all hosts have failed 7557 1726882118.41979: getting the remaining hosts for this loop 7557 1726882118.41981: done getting the remaining hosts for this loop 7557 1726882118.41984: getting the next task for host managed_node3 7557 1726882118.41991: done getting next task for host managed_node3 7557 1726882118.41999: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 7557 1726882118.42002: ^ state is: HOST STATE: block=2, task=37, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7557 1726882118.42026: getting variables 7557 1726882118.42027: in VariableManager get_vars() 7557 1726882118.42080: Calling all_inventory to load vars for managed_node3 7557 1726882118.42083: Calling groups_inventory to load vars for managed_node3 7557 1726882118.42086: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882118.42304: Calling all_plugins_play to load vars for managed_node3 7557 1726882118.42308: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882118.42312: Calling groups_plugins_play to load vars for managed_node3 7557 1726882118.43828: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882118.45361: done with get_vars() 7557 1726882118.45387: done getting variables 7557 1726882118.45453: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 21:28:38 -0400 (0:00:00.083) 0:00:44.307 ****** 7557 1726882118.45490: entering _queue_task() for managed_node3/fail 7557 1726882118.45845: worker is 1 (out of 1 available) 7557 1726882118.45859: exiting _queue_task() for managed_node3/fail 7557 1726882118.45872: done queuing things up, now waiting for results queue to drain 7557 1726882118.45873: waiting for pending results... 7557 1726882118.46222: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 7557 1726882118.46340: in run() - task 12673a56-9f93-ed48-b3a5-000000000112 7557 1726882118.46360: variable 'ansible_search_path' from source: unknown 7557 1726882118.46500: variable 'ansible_search_path' from source: unknown 7557 1726882118.46505: calling self._execute() 7557 1726882118.46534: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882118.46549: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882118.46565: variable 'omit' from source: magic vars 7557 1726882118.46979: variable 'ansible_distribution_major_version' from source: facts 7557 1726882118.47001: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882118.47128: variable '__network_wireless_connections_defined' from source: role '' defaults 7557 1726882118.47342: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 7557 1726882118.49547: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 7557 1726882118.49638: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 7557 1726882118.49683: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 7557 1726882118.49726: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 7557 1726882118.49757: Loading FilterModule 'urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 7557 1726882118.49846: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7557 1726882118.49888: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7557 1726882118.50099: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7557 1726882118.50103: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7557 1726882118.50107: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7557 1726882118.50109: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7557 1726882118.50112: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7557 1726882118.50115: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7557 1726882118.50129: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7557 1726882118.50150: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7557 1726882118.50196: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7557 1726882118.50228: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7557 1726882118.50258: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7557 1726882118.50303: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7557 1726882118.50323: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7557 1726882118.50513: variable 'network_connections' from source: task vars 7557 1726882118.50531: variable 
'interface' from source: play vars 7557 1726882118.50606: variable 'interface' from source: play vars 7557 1726882118.50686: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 7557 1726882118.50858: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 7557 1726882118.50919: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 7557 1726882118.50953: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 7557 1726882118.50987: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 7557 1726882118.51031: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 7557 1726882118.51052: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 7557 1726882118.51076: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 7557 1726882118.51113: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 7557 1726882118.51207: variable '__network_team_connections_defined' from source: role '' defaults 7557 1726882118.51428: variable 'network_connections' from source: task vars 7557 1726882118.51437: variable 'interface' from source: play vars 7557 1726882118.51502: variable 'interface' from source: play vars 7557 1726882118.51539: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 7557 1726882118.51547: when evaluation is False, skipping this task 7557 1726882118.51553: _execute() done 7557 1726882118.51639: dumping result to json 7557 1726882118.51642: done dumping result, returning 7557 1726882118.51644: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [12673a56-9f93-ed48-b3a5-000000000112] 7557 1726882118.51646: sending task result for task 12673a56-9f93-ed48-b3a5-000000000112 7557 1726882118.51731: done sending task result for task 12673a56-9f93-ed48-b3a5-000000000112 7557 1726882118.51734: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 7557 1726882118.51788: no more pending results, returning what we have 7557 1726882118.51796: results queue empty 7557 1726882118.51797: checking for any_errors_fatal 7557 1726882118.51803: done checking for any_errors_fatal 7557 1726882118.51804: checking for max_fail_percentage 7557 1726882118.51806: done checking for max_fail_percentage 7557 1726882118.51807: checking to see if all hosts have failed and the running result is not ok 7557 1726882118.51808: done checking to see if all hosts have failed 7557 1726882118.51809: getting the remaining hosts for this loop 7557 1726882118.51811: done getting the remaining hosts for this loop 7557 1726882118.51814: 
getting the next task for host managed_node3 7557 1726882118.51821: done getting next task for host managed_node3 7557 1726882118.51825: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 7557 1726882118.51828: ^ state is: HOST STATE: block=2, task=37, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7557 1726882118.51850: getting variables 7557 1726882118.51852: in VariableManager get_vars() 7557 1726882118.51913: Calling all_inventory to load vars for managed_node3 7557 1726882118.51916: Calling groups_inventory to load vars for managed_node3 7557 1726882118.51918: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882118.51929: Calling all_plugins_play to load vars for managed_node3 7557 1726882118.51932: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882118.51935: Calling groups_plugins_play to load vars for managed_node3 7557 1726882118.53647: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882118.55481: done with get_vars() 7557 1726882118.55510: done getting variables 7557 1726882118.55573: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 21:28:38 -0400 (0:00:00.101) 0:00:44.409 ****** 7557 1726882118.55620: entering _queue_task() for managed_node3/package 7557 1726882118.55990: worker is 1 (out of 1 available) 7557 1726882118.56143: exiting _queue_task() for managed_node3/package 7557 1726882118.56155: done queuing things up, now waiting for results queue to drain 7557 1726882118.56157: waiting for pending results... 
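The consent prompt above is implemented with the fail action plugin (the log shows fail.py being loaded for it) and is skipped under the same wireless-or-team condition as the package checks. A minimal sketch of a task wired up that way is shown below; the message wording is an assumption, since the log does not include the task body, and only the module and the condition are visible in the output.

- name: Ask user's consent to restart NetworkManager due to wireless or team interfaces
  ansible.builtin.fail:
    # Message text below is an assumption for illustration only.
    msg: >-
      Managing wireless or team interfaces requires restarting NetworkManager;
      allow the restart explicitly before re-running the role.
  when: __network_wireless_connections_defined or __network_team_connections_defined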
7557 1726882118.56399: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install packages 7557 1726882118.56573: in run() - task 12673a56-9f93-ed48-b3a5-000000000113 7557 1726882118.56577: variable 'ansible_search_path' from source: unknown 7557 1726882118.56580: variable 'ansible_search_path' from source: unknown 7557 1726882118.56606: calling self._execute() 7557 1726882118.56728: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882118.56741: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882118.56758: variable 'omit' from source: magic vars 7557 1726882118.57200: variable 'ansible_distribution_major_version' from source: facts 7557 1726882118.57204: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882118.57422: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 7557 1726882118.57703: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 7557 1726882118.57770: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 7557 1726882118.57790: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 7557 1726882118.57867: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 7557 1726882118.58199: variable 'network_packages' from source: role '' defaults 7557 1726882118.58203: variable '__network_provider_setup' from source: role '' defaults 7557 1726882118.58205: variable '__network_service_name_default_nm' from source: role '' defaults 7557 1726882118.58208: variable '__network_service_name_default_nm' from source: role '' defaults 7557 1726882118.58209: variable '__network_packages_default_nm' from source: role '' defaults 7557 1726882118.58256: variable '__network_packages_default_nm' from source: role '' defaults 7557 1726882118.58461: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 7557 1726882118.60512: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 7557 1726882118.60573: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 7557 1726882118.60624: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 7557 1726882118.60662: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 7557 1726882118.60690: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 7557 1726882118.60795: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7557 1726882118.60838: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7557 1726882118.60871: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7557 1726882118.60920: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7557 1726882118.60943: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7557 1726882118.60996: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7557 1726882118.61025: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7557 1726882118.61165: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7557 1726882118.61168: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7557 1726882118.61170: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7557 1726882118.61369: variable '__network_packages_default_gobject_packages' from source: role '' defaults 7557 1726882118.61505: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7557 1726882118.61534: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7557 1726882118.61562: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7557 1726882118.61614: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7557 1726882118.61634: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7557 1726882118.61735: variable 'ansible_python' from source: facts 7557 1726882118.61767: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 7557 1726882118.61860: variable '__network_wpa_supplicant_required' from source: role '' defaults 7557 1726882118.61952: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 7557 1726882118.62086: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7557 1726882118.62119: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, 
class_only=False) 7557 1726882118.62154: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7557 1726882118.62199: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7557 1726882118.62218: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7557 1726882118.62272: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7557 1726882118.62363: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7557 1726882118.62366: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7557 1726882118.62384: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7557 1726882118.62407: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7557 1726882118.62561: variable 'network_connections' from source: task vars 7557 1726882118.62576: variable 'interface' from source: play vars 7557 1726882118.62686: variable 'interface' from source: play vars 7557 1726882118.62761: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 7557 1726882118.62998: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 7557 1726882118.63003: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 7557 1726882118.63005: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 7557 1726882118.63007: variable '__network_wireless_connections_defined' from source: role '' defaults 7557 1726882118.63208: variable 'network_connections' from source: task vars 7557 1726882118.63218: variable 'interface' from source: play vars 7557 1726882118.63327: variable 'interface' from source: play vars 7557 1726882118.63372: variable '__network_packages_default_wireless' from source: role '' defaults 7557 1726882118.63462: variable '__network_wireless_connections_defined' from source: role '' defaults 7557 1726882118.63758: variable 'network_connections' from source: task vars 7557 1726882118.63774: variable 'interface' from source: play vars 7557 
1726882118.63848: variable 'interface' from source: play vars 7557 1726882118.63877: variable '__network_packages_default_team' from source: role '' defaults 7557 1726882118.63972: variable '__network_team_connections_defined' from source: role '' defaults 7557 1726882118.64318: variable 'network_connections' from source: task vars 7557 1726882118.64337: variable 'interface' from source: play vars 7557 1726882118.64406: variable 'interface' from source: play vars 7557 1726882118.64472: variable '__network_service_name_default_initscripts' from source: role '' defaults 7557 1726882118.64546: variable '__network_service_name_default_initscripts' from source: role '' defaults 7557 1726882118.64559: variable '__network_packages_default_initscripts' from source: role '' defaults 7557 1726882118.64625: variable '__network_packages_default_initscripts' from source: role '' defaults 7557 1726882118.64874: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 7557 1726882118.65621: variable 'network_connections' from source: task vars 7557 1726882118.65680: variable 'interface' from source: play vars 7557 1726882118.65708: variable 'interface' from source: play vars 7557 1726882118.65722: variable 'ansible_distribution' from source: facts 7557 1726882118.65731: variable '__network_rh_distros' from source: role '' defaults 7557 1726882118.65742: variable 'ansible_distribution_major_version' from source: facts 7557 1726882118.65761: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 7557 1726882118.65946: variable 'ansible_distribution' from source: facts 7557 1726882118.66002: variable '__network_rh_distros' from source: role '' defaults 7557 1726882118.66011: variable 'ansible_distribution_major_version' from source: facts 7557 1726882118.66014: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 7557 1726882118.66170: variable 'ansible_distribution' from source: facts 7557 1726882118.66179: variable '__network_rh_distros' from source: role '' defaults 7557 1726882118.66188: variable 'ansible_distribution_major_version' from source: facts 7557 1726882118.66246: variable 'network_provider' from source: set_fact 7557 1726882118.66269: variable 'ansible_facts' from source: unknown 7557 1726882118.67063: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 7557 1726882118.67098: when evaluation is False, skipping this task 7557 1726882118.67101: _execute() done 7557 1726882118.67104: dumping result to json 7557 1726882118.67106: done dumping result, returning 7557 1726882118.67205: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install packages [12673a56-9f93-ed48-b3a5-000000000113] 7557 1726882118.67208: sending task result for task 12673a56-9f93-ed48-b3a5-000000000113 7557 1726882118.67281: done sending task result for task 12673a56-9f93-ed48-b3a5-000000000113 7557 1726882118.67286: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 7557 1726882118.67353: no more pending results, returning what we have 7557 1726882118.67358: results queue empty 7557 1726882118.67359: checking for any_errors_fatal 7557 1726882118.67367: done checking for any_errors_fatal 7557 1726882118.67368: checking for max_fail_percentage 7557 1726882118.67370: done checking for 
max_fail_percentage 7557 1726882118.67371: checking to see if all hosts have failed and the running result is not ok 7557 1726882118.67372: done checking to see if all hosts have failed 7557 1726882118.67373: getting the remaining hosts for this loop 7557 1726882118.67374: done getting the remaining hosts for this loop 7557 1726882118.67378: getting the next task for host managed_node3 7557 1726882118.67384: done getting next task for host managed_node3 7557 1726882118.67388: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 7557 1726882118.67391: ^ state is: HOST STATE: block=2, task=37, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7557 1726882118.67420: getting variables 7557 1726882118.67422: in VariableManager get_vars() 7557 1726882118.67476: Calling all_inventory to load vars for managed_node3 7557 1726882118.67479: Calling groups_inventory to load vars for managed_node3 7557 1726882118.67482: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882118.67709: Calling all_plugins_play to load vars for managed_node3 7557 1726882118.67713: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882118.67718: Calling groups_plugins_play to load vars for managed_node3 7557 1726882118.70671: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882118.72811: done with get_vars() 7557 1726882118.72869: done getting variables 7557 1726882118.72935: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 21:28:38 -0400 (0:00:00.173) 0:00:44.582 ****** 7557 1726882118.72978: entering _queue_task() for managed_node3/package 7557 1726882118.73237: worker is 1 (out of 1 available) 7557 1726882118.73250: exiting _queue_task() for managed_node3/package 7557 1726882118.73262: done queuing things up, now waiting for results queue to drain 7557 1726882118.73264: waiting for pending results... 
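The Install packages task above is the one case in this block that is skipped by a package-facts test rather than by the wireless/team or network_state guards: it only runs when network_packages is not already a subset of the package names gathered into ansible_facts.packages. A hypothetical reconstruction of that guard is sketched below; the condition and the package action come from the log, the rest is an assumption.

- name: Install packages
  ansible.builtin.package:
    name: "{{ network_packages }}"
    state: present
  # 'subset' is the built-in Jinja2 test: skip the install when every requested
  # package already appears in the gathered package facts.
  when: not network_packages is subset(ansible_facts.packages.keys())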
7557 1726882118.73457: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 7557 1726882118.73568: in run() - task 12673a56-9f93-ed48-b3a5-000000000114 7557 1726882118.73579: variable 'ansible_search_path' from source: unknown 7557 1726882118.73583: variable 'ansible_search_path' from source: unknown 7557 1726882118.73800: calling self._execute() 7557 1726882118.73804: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882118.73806: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882118.73809: variable 'omit' from source: magic vars 7557 1726882118.74148: variable 'ansible_distribution_major_version' from source: facts 7557 1726882118.74168: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882118.74290: variable 'network_state' from source: role '' defaults 7557 1726882118.74311: Evaluated conditional (network_state != {}): False 7557 1726882118.74319: when evaluation is False, skipping this task 7557 1726882118.74326: _execute() done 7557 1726882118.74332: dumping result to json 7557 1726882118.74340: done dumping result, returning 7557 1726882118.74351: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [12673a56-9f93-ed48-b3a5-000000000114] 7557 1726882118.74361: sending task result for task 12673a56-9f93-ed48-b3a5-000000000114 skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 7557 1726882118.74523: no more pending results, returning what we have 7557 1726882118.74528: results queue empty 7557 1726882118.74529: checking for any_errors_fatal 7557 1726882118.74536: done checking for any_errors_fatal 7557 1726882118.74536: checking for max_fail_percentage 7557 1726882118.74538: done checking for max_fail_percentage 7557 1726882118.74539: checking to see if all hosts have failed and the running result is not ok 7557 1726882118.74540: done checking to see if all hosts have failed 7557 1726882118.74541: getting the remaining hosts for this loop 7557 1726882118.74542: done getting the remaining hosts for this loop 7557 1726882118.74546: getting the next task for host managed_node3 7557 1726882118.74553: done getting next task for host managed_node3 7557 1726882118.74557: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 7557 1726882118.74561: ^ state is: HOST STATE: block=2, task=37, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7557 1726882118.74586: getting variables 7557 1726882118.74588: in VariableManager get_vars() 7557 1726882118.74648: Calling all_inventory to load vars for managed_node3 7557 1726882118.74652: Calling groups_inventory to load vars for managed_node3 7557 1726882118.74655: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882118.74670: Calling all_plugins_play to load vars for managed_node3 7557 1726882118.74675: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882118.74680: Calling groups_plugins_play to load vars for managed_node3 7557 1726882118.75707: done sending task result for task 12673a56-9f93-ed48-b3a5-000000000114 7557 1726882118.75711: WORKER PROCESS EXITING 7557 1726882118.75763: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882118.76632: done with get_vars() 7557 1726882118.76654: done getting variables 7557 1726882118.76702: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 21:28:38 -0400 (0:00:00.037) 0:00:44.620 ****** 7557 1726882118.76728: entering _queue_task() for managed_node3/package 7557 1726882118.77087: worker is 1 (out of 1 available) 7557 1726882118.77306: exiting _queue_task() for managed_node3/package 7557 1726882118.77317: done queuing things up, now waiting for results queue to drain 7557 1726882118.77318: waiting for pending results... 
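Both nmstate-related install tasks (main.yml:85 and main.yml:96) hinge on the same check: they only run when the caller supplies a non-empty network_state. Because network_state defaults to {} in this run, both are skipped. A sketch of that pattern follows, with the package list inferred from the task name rather than from the role source.

- name: Install NetworkManager and nmstate when using network_state variable
  ansible.builtin.package:
    name:
      - NetworkManager
      - nmstate
    state: present
  when: network_state != {}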
7557 1726882118.77447: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 7557 1726882118.77587: in run() - task 12673a56-9f93-ed48-b3a5-000000000115 7557 1726882118.77653: variable 'ansible_search_path' from source: unknown 7557 1726882118.77656: variable 'ansible_search_path' from source: unknown 7557 1726882118.77669: calling self._execute() 7557 1726882118.77786: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882118.77802: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882118.77817: variable 'omit' from source: magic vars 7557 1726882118.78411: variable 'ansible_distribution_major_version' from source: facts 7557 1726882118.78425: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882118.78531: variable 'network_state' from source: role '' defaults 7557 1726882118.78540: Evaluated conditional (network_state != {}): False 7557 1726882118.78543: when evaluation is False, skipping this task 7557 1726882118.78546: _execute() done 7557 1726882118.78549: dumping result to json 7557 1726882118.78551: done dumping result, returning 7557 1726882118.78561: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [12673a56-9f93-ed48-b3a5-000000000115] 7557 1726882118.78564: sending task result for task 12673a56-9f93-ed48-b3a5-000000000115 7557 1726882118.78672: done sending task result for task 12673a56-9f93-ed48-b3a5-000000000115 7557 1726882118.78674: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 7557 1726882118.78727: no more pending results, returning what we have 7557 1726882118.78731: results queue empty 7557 1726882118.78732: checking for any_errors_fatal 7557 1726882118.78741: done checking for any_errors_fatal 7557 1726882118.78742: checking for max_fail_percentage 7557 1726882118.78744: done checking for max_fail_percentage 7557 1726882118.78744: checking to see if all hosts have failed and the running result is not ok 7557 1726882118.78745: done checking to see if all hosts have failed 7557 1726882118.78746: getting the remaining hosts for this loop 7557 1726882118.78747: done getting the remaining hosts for this loop 7557 1726882118.78751: getting the next task for host managed_node3 7557 1726882118.78757: done getting next task for host managed_node3 7557 1726882118.78761: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 7557 1726882118.78763: ^ state is: HOST STATE: block=2, task=37, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7557 1726882118.78795: getting variables 7557 1726882118.78798: in VariableManager get_vars() 7557 1726882118.78841: Calling all_inventory to load vars for managed_node3 7557 1726882118.78844: Calling groups_inventory to load vars for managed_node3 7557 1726882118.78846: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882118.78856: Calling all_plugins_play to load vars for managed_node3 7557 1726882118.78859: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882118.78861: Calling groups_plugins_play to load vars for managed_node3 7557 1726882118.79643: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882118.80937: done with get_vars() 7557 1726882118.80961: done getting variables 7557 1726882118.81026: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 21:28:38 -0400 (0:00:00.043) 0:00:44.663 ****** 7557 1726882118.81060: entering _queue_task() for managed_node3/service 7557 1726882118.81395: worker is 1 (out of 1 available) 7557 1726882118.81408: exiting _queue_task() for managed_node3/service 7557 1726882118.81423: done queuing things up, now waiting for results queue to drain 7557 1726882118.81424: waiting for pending results... 7557 1726882118.81868: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 7557 1726882118.81942: in run() - task 12673a56-9f93-ed48-b3a5-000000000116 7557 1726882118.81964: variable 'ansible_search_path' from source: unknown 7557 1726882118.81973: variable 'ansible_search_path' from source: unknown 7557 1726882118.82022: calling self._execute() 7557 1726882118.82198: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882118.82454: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882118.82458: variable 'omit' from source: magic vars 7557 1726882118.83006: variable 'ansible_distribution_major_version' from source: facts 7557 1726882118.83200: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882118.83399: variable '__network_wireless_connections_defined' from source: role '' defaults 7557 1726882118.83611: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 7557 1726882118.85860: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 7557 1726882118.85934: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 7557 1726882118.85984: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 7557 1726882118.86037: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 7557 1726882118.86066: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 7557 1726882118.86154: 
Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7557 1726882118.86189: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7557 1726882118.86226: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7557 1726882118.86271: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7557 1726882118.86297: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7557 1726882118.86346: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7557 1726882118.86375: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7557 1726882118.86414: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7557 1726882118.86466: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7557 1726882118.86490: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7557 1726882118.86543: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7557 1726882118.86573: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7557 1726882118.86609: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7557 1726882118.86650: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7557 1726882118.86668: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7557 1726882118.86857: variable 'network_connections' from source: task vars 7557 1726882118.86873: variable 'interface' from source: play vars 7557 1726882118.86951: variable 'interface' from source: play vars 7557 
1726882118.87034: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 7557 1726882118.87231: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 7557 1726882118.87250: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 7557 1726882118.87283: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 7557 1726882118.87330: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 7557 1726882118.87379: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 7557 1726882118.87447: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 7557 1726882118.87450: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 7557 1726882118.87469: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 7557 1726882118.87525: variable '__network_team_connections_defined' from source: role '' defaults 7557 1726882118.87782: variable 'network_connections' from source: task vars 7557 1726882118.87796: variable 'interface' from source: play vars 7557 1726882118.87862: variable 'interface' from source: play vars 7557 1726882118.87987: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 7557 1726882118.87991: when evaluation is False, skipping this task 7557 1726882118.87997: _execute() done 7557 1726882118.88000: dumping result to json 7557 1726882118.88002: done dumping result, returning 7557 1726882118.88005: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [12673a56-9f93-ed48-b3a5-000000000116] 7557 1726882118.88007: sending task result for task 12673a56-9f93-ed48-b3a5-000000000116 7557 1726882118.88087: done sending task result for task 12673a56-9f93-ed48-b3a5-000000000116 7557 1726882118.88101: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 7557 1726882118.88153: no more pending results, returning what we have 7557 1726882118.88157: results queue empty 7557 1726882118.88159: checking for any_errors_fatal 7557 1726882118.88166: done checking for any_errors_fatal 7557 1726882118.88167: checking for max_fail_percentage 7557 1726882118.88169: done checking for max_fail_percentage 7557 1726882118.88170: checking to see if all hosts have failed and the running result is not ok 7557 1726882118.88171: done checking to see if all hosts have failed 7557 1726882118.88172: getting the remaining hosts for this loop 7557 1726882118.88174: done getting the remaining hosts for this loop 7557 1726882118.88178: getting the next task for host managed_node3 7557 1726882118.88186: done getting next task for host managed_node3 7557 
1726882118.88191: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 7557 1726882118.88198: ^ state is: HOST STATE: block=2, task=37, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7557 1726882118.88223: getting variables 7557 1726882118.88225: in VariableManager get_vars() 7557 1726882118.88283: Calling all_inventory to load vars for managed_node3 7557 1726882118.88288: Calling groups_inventory to load vars for managed_node3 7557 1726882118.88291: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882118.88514: Calling all_plugins_play to load vars for managed_node3 7557 1726882118.88518: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882118.88521: Calling groups_plugins_play to load vars for managed_node3 7557 1726882118.90214: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882118.91661: done with get_vars() 7557 1726882118.91690: done getting variables 7557 1726882118.91756: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 21:28:38 -0400 (0:00:00.107) 0:00:44.770 ****** 7557 1726882118.91796: entering _queue_task() for managed_node3/service 7557 1726882118.92153: worker is 1 (out of 1 available) 7557 1726882118.92167: exiting _queue_task() for managed_node3/service 7557 1726882118.92181: done queuing things up, now waiting for results queue to drain 7557 1726882118.92183: waiting for pending results... 
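The task banner above points at the role task at roles/network/tasks/main.yml:122. As a hedged reconstruction only (inferred from the 'service' action plugin, the conditional evaluated in the next messages, the systemd module arguments, and the no_log-censored result later in this run; the role's actual source may be worded differently), the task likely resembles:

    # Reconstruction, not copied from the role source
    - name: Enable and start NetworkManager
      ansible.builtin.service:
        name: "{{ network_service_name }}"   # resolves to "NetworkManager" in this run
        state: started
        enabled: true
      when: network_provider == "nm" or network_state != {}
      no_log: true   # why the task result is later shown as censored

With the 'nm' provider this dispatches to the systemd module over SSH, which is what the low-level command execution below shows.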
7557 1726882118.92617: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 7557 1726882118.92657: in run() - task 12673a56-9f93-ed48-b3a5-000000000117 7557 1726882118.92712: variable 'ansible_search_path' from source: unknown 7557 1726882118.92716: variable 'ansible_search_path' from source: unknown 7557 1726882118.92743: calling self._execute() 7557 1726882118.92874: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882118.92930: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882118.92935: variable 'omit' from source: magic vars 7557 1726882118.93327: variable 'ansible_distribution_major_version' from source: facts 7557 1726882118.93345: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882118.93527: variable 'network_provider' from source: set_fact 7557 1726882118.93538: variable 'network_state' from source: role '' defaults 7557 1726882118.93552: Evaluated conditional (network_provider == "nm" or network_state != {}): True 7557 1726882118.93581: variable 'omit' from source: magic vars 7557 1726882118.93628: variable 'omit' from source: magic vars 7557 1726882118.93662: variable 'network_service_name' from source: role '' defaults 7557 1726882118.93801: variable 'network_service_name' from source: role '' defaults 7557 1726882118.93851: variable '__network_provider_setup' from source: role '' defaults 7557 1726882118.93862: variable '__network_service_name_default_nm' from source: role '' defaults 7557 1726882118.93936: variable '__network_service_name_default_nm' from source: role '' defaults 7557 1726882118.93950: variable '__network_packages_default_nm' from source: role '' defaults 7557 1726882118.94021: variable '__network_packages_default_nm' from source: role '' defaults 7557 1726882118.94254: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 7557 1726882118.96603: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 7557 1726882118.96606: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 7557 1726882118.96609: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 7557 1726882118.96611: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 7557 1726882118.96629: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 7557 1726882118.96717: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7557 1726882118.96749: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7557 1726882118.96779: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7557 1726882118.96833: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 
7557 1726882118.96854: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7557 1726882118.96905: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7557 1726882118.96937: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7557 1726882118.96963: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7557 1726882118.97010: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7557 1726882118.97037: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7557 1726882118.97274: variable '__network_packages_default_gobject_packages' from source: role '' defaults 7557 1726882118.97401: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7557 1726882118.97429: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7557 1726882118.97458: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7557 1726882118.97512: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7557 1726882118.97532: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7557 1726882118.97634: variable 'ansible_python' from source: facts 7557 1726882118.97660: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 7557 1726882118.97750: variable '__network_wpa_supplicant_required' from source: role '' defaults 7557 1726882118.97836: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 7557 1726882118.97965: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7557 1726882118.98001: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7557 1726882118.98033: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7557 1726882118.98076: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7557 1726882118.98100: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7557 1726882118.98152: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7557 1726882118.98182: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7557 1726882118.98213: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7557 1726882118.98257: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7557 1726882118.98274: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7557 1726882118.98404: variable 'network_connections' from source: task vars 7557 1726882118.98416: variable 'interface' from source: play vars 7557 1726882118.98553: variable 'interface' from source: play vars 7557 1726882118.98668: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 7557 1726882118.98811: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 7557 1726882118.98859: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 7557 1726882118.98913: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 7557 1726882118.98958: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 7557 1726882118.99026: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 7557 1726882118.99060: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 7557 1726882118.99102: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 7557 1726882118.99141: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 7557 1726882118.99191: variable '__network_wireless_connections_defined' from source: role '' defaults 7557 1726882118.99500: variable 'network_connections' from source: task vars 
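A few messages below, the executor pins the connection settings for this task (Set connection var ansible_connection to ssh, ansible_shell_type to sh, ansible_pipelining to False, ansible_timeout to 10, and so on). In this run they appear to come from ansible-core defaults, since they are logged as 'from source: unknown' and no config file or inventory override sets them; if you wanted the same behaviour pinned explicitly, equivalent host_vars entries would look like this (illustrative, not taken from this inventory):

    # Explicit host_vars that mirror the connection defaults used below
    ansible_connection: ssh
    ansible_shell_type: sh
    ansible_shell_executable: /bin/sh
    ansible_pipelining: false
    ansible_timeout: 10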
7557 1726882118.99503: variable 'interface' from source: play vars 7557 1726882118.99550: variable 'interface' from source: play vars 7557 1726882118.99586: variable '__network_packages_default_wireless' from source: role '' defaults 7557 1726882118.99675: variable '__network_wireless_connections_defined' from source: role '' defaults 7557 1726882118.99982: variable 'network_connections' from source: task vars 7557 1726882118.99992: variable 'interface' from source: play vars 7557 1726882119.00066: variable 'interface' from source: play vars 7557 1726882119.00187: variable '__network_packages_default_team' from source: role '' defaults 7557 1726882119.00190: variable '__network_team_connections_defined' from source: role '' defaults 7557 1726882119.00481: variable 'network_connections' from source: task vars 7557 1726882119.00497: variable 'interface' from source: play vars 7557 1726882119.00573: variable 'interface' from source: play vars 7557 1726882119.00640: variable '__network_service_name_default_initscripts' from source: role '' defaults 7557 1726882119.00710: variable '__network_service_name_default_initscripts' from source: role '' defaults 7557 1726882119.00723: variable '__network_packages_default_initscripts' from source: role '' defaults 7557 1726882119.00789: variable '__network_packages_default_initscripts' from source: role '' defaults 7557 1726882119.01031: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 7557 1726882119.01601: variable 'network_connections' from source: task vars 7557 1726882119.01604: variable 'interface' from source: play vars 7557 1726882119.01612: variable 'interface' from source: play vars 7557 1726882119.01625: variable 'ansible_distribution' from source: facts 7557 1726882119.01633: variable '__network_rh_distros' from source: role '' defaults 7557 1726882119.01643: variable 'ansible_distribution_major_version' from source: facts 7557 1726882119.01659: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 7557 1726882119.01828: variable 'ansible_distribution' from source: facts 7557 1726882119.01836: variable '__network_rh_distros' from source: role '' defaults 7557 1726882119.01845: variable 'ansible_distribution_major_version' from source: facts 7557 1726882119.01861: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 7557 1726882119.02038: variable 'ansible_distribution' from source: facts 7557 1726882119.02049: variable '__network_rh_distros' from source: role '' defaults 7557 1726882119.02060: variable 'ansible_distribution_major_version' from source: facts 7557 1726882119.02144: variable 'network_provider' from source: set_fact 7557 1726882119.02147: variable 'omit' from source: magic vars 7557 1726882119.02172: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7557 1726882119.02213: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7557 1726882119.02238: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7557 1726882119.02266: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7557 1726882119.02281: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7557 1726882119.02317: variable 'inventory_hostname' from source: host 
vars for 'managed_node3' 7557 1726882119.02361: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882119.02366: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882119.02447: Set connection var ansible_module_compression to ZIP_DEFLATED 7557 1726882119.02460: Set connection var ansible_shell_executable to /bin/sh 7557 1726882119.02471: Set connection var ansible_shell_type to sh 7557 1726882119.02481: Set connection var ansible_pipelining to False 7557 1726882119.02487: Set connection var ansible_connection to ssh 7557 1726882119.02599: Set connection var ansible_timeout to 10 7557 1726882119.02603: variable 'ansible_shell_executable' from source: unknown 7557 1726882119.02605: variable 'ansible_connection' from source: unknown 7557 1726882119.02607: variable 'ansible_module_compression' from source: unknown 7557 1726882119.02609: variable 'ansible_shell_type' from source: unknown 7557 1726882119.02611: variable 'ansible_shell_executable' from source: unknown 7557 1726882119.02613: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882119.02615: variable 'ansible_pipelining' from source: unknown 7557 1726882119.02617: variable 'ansible_timeout' from source: unknown 7557 1726882119.02619: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882119.02673: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 7557 1726882119.02689: variable 'omit' from source: magic vars 7557 1726882119.02709: starting attempt loop 7557 1726882119.02717: running the handler 7557 1726882119.02800: variable 'ansible_facts' from source: unknown 7557 1726882119.03566: _low_level_execute_command(): starting 7557 1726882119.03578: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7557 1726882119.04236: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7557 1726882119.04282: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7557 1726882119.04285: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7557 1726882119.04392: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882119.04411: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882119.04427: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882119.04516: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 7557 1726882119.06488: stdout chunk (state=3): >>>/root <<< 7557 1726882119.06495: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882119.06498: stdout chunk (state=3): >>><<< 7557 1726882119.06501: stderr chunk (state=3): >>><<< 7557 1726882119.06503: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7557 1726882119.06505: _low_level_execute_command(): starting 7557 1726882119.06508: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882119.063987-9288-221059262342658 `" && echo ansible-tmp-1726882119.063987-9288-221059262342658="` echo /root/.ansible/tmp/ansible-tmp-1726882119.063987-9288-221059262342658 `" ) && sleep 0' 7557 1726882119.07608: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7557 1726882119.07685: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882119.07689: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882119.07916: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882119.07976: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882119.09837: stdout chunk (state=3): 
>>>ansible-tmp-1726882119.063987-9288-221059262342658=/root/.ansible/tmp/ansible-tmp-1726882119.063987-9288-221059262342658 <<< 7557 1726882119.10001: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882119.10005: stderr chunk (state=3): >>><<< 7557 1726882119.10007: stdout chunk (state=3): >>><<< 7557 1726882119.10100: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882119.063987-9288-221059262342658=/root/.ansible/tmp/ansible-tmp-1726882119.063987-9288-221059262342658 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7557 1726882119.10121: variable 'ansible_module_compression' from source: unknown 7557 1726882119.10226: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-7557ap94rh2e/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 7557 1726882119.10700: variable 'ansible_facts' from source: unknown 7557 1726882119.10844: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882119.063987-9288-221059262342658/AnsiballZ_systemd.py 7557 1726882119.11182: Sending initial data 7557 1726882119.11196: Sent initial data (153 bytes) 7557 1726882119.12463: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882119.12679: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882119.12699: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882119.12771: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 7557 1726882119.14272: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 7557 1726882119.14366: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 7557 1726882119.14426: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-7557ap94rh2e/tmpy_6tzlsy /root/.ansible/tmp/ansible-tmp-1726882119.063987-9288-221059262342658/AnsiballZ_systemd.py <<< 7557 1726882119.14430: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882119.063987-9288-221059262342658/AnsiballZ_systemd.py" <<< 7557 1726882119.14607: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-7557ap94rh2e/tmpy_6tzlsy" to remote "/root/.ansible/tmp/ansible-tmp-1726882119.063987-9288-221059262342658/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882119.063987-9288-221059262342658/AnsiballZ_systemd.py" <<< 7557 1726882119.18204: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882119.18245: stderr chunk (state=3): >>><<< 7557 1726882119.18249: stdout chunk (state=3): >>><<< 7557 1726882119.18308: done transferring module to remote 7557 1726882119.18374: _low_level_execute_command(): starting 7557 1726882119.18377: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882119.063987-9288-221059262342658/ /root/.ansible/tmp/ansible-tmp-1726882119.063987-9288-221059262342658/AnsiballZ_systemd.py && sleep 0' 7557 1726882119.19541: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7557 1726882119.19800: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882119.19816: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7557 
1726882119.19829: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882119.19904: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882119.21606: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882119.21667: stderr chunk (state=3): >>><<< 7557 1726882119.21670: stdout chunk (state=3): >>><<< 7557 1726882119.21673: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7557 1726882119.21675: _low_level_execute_command(): starting 7557 1726882119.21682: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882119.063987-9288-221059262342658/AnsiballZ_systemd.py && sleep 0' 7557 1726882119.22263: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7557 1726882119.22300: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7557 1726882119.22304: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882119.22306: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7557 1726882119.22309: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 <<< 7557 1726882119.22322: stderr chunk (state=3): >>>debug2: match not found <<< 7557 1726882119.22325: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882119.22339: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7557 1726882119.22417: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.10.229 is address <<< 7557 1726882119.22428: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7557 1726882119.22431: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7557 1726882119.22433: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882119.22435: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7557 1726882119.22437: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 <<< 7557 1726882119.22439: stderr chunk (state=3): >>>debug2: 
match found <<< 7557 1726882119.22441: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882119.22472: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882119.22491: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882119.22529: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882119.22585: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882119.51048: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "711", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:22:06 EDT", "ExecMainStartTimestampMonotonic": "33869352", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 21:22:06 EDT", "ExecMainHandoffTimestampMonotonic": "33887880", "ExecMainPID": "711", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[Fri 2024-09-20 21:22:06 EDT] ; stop_time=[n/a] ; pid=711 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[Fri 2024-09-20 21:22:06 EDT] ; stop_time=[n/a] ; pid=711 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2938", "MemoryCurrent": "9535488", "MemoryPeak": "10067968", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3329355776", "EffectiveMemoryMax": "3702878208", "EffectiveMemoryHigh": "3702878208", "CPUUsageNSec": "205601000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not 
set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": 
"0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket sysinit.target system.slice", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "cloud-init.service shutdown.target NetworkManager-wait-online.service network.target multi-user.target", "After": "system.slice systemd-journald.socket dbus.socket sysinit.target network-pre.target cloud-init-local.service basic.target dbus-broker.service", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:22:07 EDT", "StateChangeTimestampMonotonic": "34618487", "InactiveExitTimestamp": "Fri 2024-09-20 21:22:06 EDT", "InactiveExitTimestampMonotonic": "33869684", "ActiveEnterTimestamp": "Fri 2024-09-20 21:22:07 EDT", "ActiveEnterTimestampMonotonic": "34618487", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:22:06 EDT", "ConditionTimestampMonotonic": "33868497", "AssertTimestamp": "Fri 2024-09-20 21:22:06 EDT", "AssertTimestampMonotonic": "33868500", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": 
"4af90bef47894571b134cea51cfa03e2", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 7557 1726882119.52681: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882119.52802: stderr chunk (state=3): >>>Shared connection to 10.31.10.229 closed. <<< 7557 1726882119.52806: stdout chunk (state=3): >>><<< 7557 1726882119.52809: stderr chunk (state=3): >>><<< 7557 1726882119.52812: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "711", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:22:06 EDT", "ExecMainStartTimestampMonotonic": "33869352", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 21:22:06 EDT", "ExecMainHandoffTimestampMonotonic": "33887880", "ExecMainPID": "711", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[Fri 2024-09-20 21:22:06 EDT] ; stop_time=[n/a] ; pid=711 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[Fri 2024-09-20 21:22:06 EDT] ; stop_time=[n/a] ; pid=711 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2938", "MemoryCurrent": "9535488", "MemoryPeak": "10067968", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3329355776", "EffectiveMemoryMax": "3702878208", "EffectiveMemoryHigh": "3702878208", "CPUUsageNSec": "205601000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", 
"IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", 
"RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket sysinit.target system.slice", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "cloud-init.service shutdown.target NetworkManager-wait-online.service network.target multi-user.target", "After": "system.slice systemd-journald.socket dbus.socket sysinit.target network-pre.target cloud-init-local.service basic.target dbus-broker.service", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:22:07 EDT", "StateChangeTimestampMonotonic": "34618487", "InactiveExitTimestamp": "Fri 2024-09-20 21:22:06 EDT", "InactiveExitTimestampMonotonic": "33869684", "ActiveEnterTimestamp": "Fri 2024-09-20 21:22:07 EDT", "ActiveEnterTimestampMonotonic": "34618487", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:22:06 EDT", "ConditionTimestampMonotonic": "33868497", "AssertTimestamp": "Fri 2024-09-20 21:22:06 EDT", "AssertTimestampMonotonic": "33868500", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": 
"none", "SuccessAction": "none", "InvocationID": "4af90bef47894571b134cea51cfa03e2", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. 7557 1726882119.53015: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882119.063987-9288-221059262342658/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7557 1726882119.53041: _low_level_execute_command(): starting 7557 1726882119.53051: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882119.063987-9288-221059262342658/ > /dev/null 2>&1 && sleep 0' 7557 1726882119.54123: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882119.54238: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882119.54289: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882119.56037: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882119.56069: stderr chunk (state=3): >>><<< 7557 1726882119.56077: stdout chunk (state=3): >>><<< 7557 1726882119.56104: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7557 1726882119.56265: handler run complete 7557 1726882119.56268: attempt loop complete, returning result 7557 1726882119.56270: _execute() done 7557 1726882119.56272: dumping result to json 7557 1726882119.56274: done dumping result, returning 7557 1726882119.56375: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [12673a56-9f93-ed48-b3a5-000000000117] 7557 1726882119.56385: sending task result for task 12673a56-9f93-ed48-b3a5-000000000117 ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 7557 1726882119.57407: no more pending results, returning what we have 7557 1726882119.57411: results queue empty 7557 1726882119.57413: checking for any_errors_fatal 7557 1726882119.57419: done checking for any_errors_fatal 7557 1726882119.57420: checking for max_fail_percentage 7557 1726882119.57422: done checking for max_fail_percentage 7557 1726882119.57423: checking to see if all hosts have failed and the running result is not ok 7557 1726882119.57424: done checking to see if all hosts have failed 7557 1726882119.57424: getting the remaining hosts for this loop 7557 1726882119.57426: done getting the remaining hosts for this loop 7557 1726882119.57430: getting the next task for host managed_node3 7557 1726882119.57438: done getting next task for host managed_node3 7557 1726882119.57442: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 7557 1726882119.57446: ^ state is: HOST STATE: block=2, task=37, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, 
pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7557 1726882119.57458: getting variables 7557 1726882119.57460: in VariableManager get_vars() 7557 1726882119.57510: Calling all_inventory to load vars for managed_node3 7557 1726882119.57513: Calling groups_inventory to load vars for managed_node3 7557 1726882119.57516: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882119.57527: Calling all_plugins_play to load vars for managed_node3 7557 1726882119.57530: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882119.57533: Calling groups_plugins_play to load vars for managed_node3 7557 1726882119.58302: done sending task result for task 12673a56-9f93-ed48-b3a5-000000000117 7557 1726882119.58305: WORKER PROCESS EXITING 7557 1726882119.60353: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882119.62209: done with get_vars() 7557 1726882119.62231: done getting variables 7557 1726882119.62301: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 21:28:39 -0400 (0:00:00.705) 0:00:45.476 ****** 7557 1726882119.62345: entering _queue_task() for managed_node3/service 7557 1726882119.62825: worker is 1 (out of 1 available) 7557 1726882119.62837: exiting _queue_task() for managed_node3/service 7557 1726882119.62849: done queuing things up, now waiting for results queue to drain 7557 1726882119.62851: waiting for pending results... 
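The task that just completed above invoked ansible.legacy.systemd with name=NetworkManager, state=started, enabled=True and no_log enabled, which is why the visible result is censored. A minimal sketch of an equivalent task, reconstructed only from those logged module arguments (the actual task text lives inside the fedora.linux_system_roles.network role and may differ):

    - name: Enable and start NetworkManager
      ansible.builtin.systemd:      # logged as ansible.legacy.systemd
        name: NetworkManager
        state: started
        enabled: true
      no_log: true                  # matches "'_ansible_no_log': True" and the censored result above
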
7557 1726882119.63059: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 7557 1726882119.63225: in run() - task 12673a56-9f93-ed48-b3a5-000000000118 7557 1726882119.63247: variable 'ansible_search_path' from source: unknown 7557 1726882119.63256: variable 'ansible_search_path' from source: unknown 7557 1726882119.63309: calling self._execute() 7557 1726882119.63425: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882119.63437: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882119.63452: variable 'omit' from source: magic vars 7557 1726882119.63857: variable 'ansible_distribution_major_version' from source: facts 7557 1726882119.63874: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882119.64056: variable 'network_provider' from source: set_fact 7557 1726882119.64060: Evaluated conditional (network_provider == "nm"): True 7557 1726882119.64116: variable '__network_wpa_supplicant_required' from source: role '' defaults 7557 1726882119.64221: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 7557 1726882119.64408: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 7557 1726882119.66600: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 7557 1726882119.66674: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 7557 1726882119.66719: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 7557 1726882119.66779: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 7557 1726882119.66797: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 7557 1726882119.66889: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7557 1726882119.67001: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7557 1726882119.67004: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7557 1726882119.67007: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7557 1726882119.67014: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7557 1726882119.67065: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7557 1726882119.67097: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, 
class_only=False) 7557 1726882119.67214: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7557 1726882119.67217: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7557 1726882119.67220: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7557 1726882119.67234: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7557 1726882119.67256: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7557 1726882119.67278: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7557 1726882119.67329: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7557 1726882119.67349: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7557 1726882119.67503: variable 'network_connections' from source: task vars 7557 1726882119.67519: variable 'interface' from source: play vars 7557 1726882119.67599: variable 'interface' from source: play vars 7557 1726882119.67681: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 7557 1726882119.67856: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 7557 1726882119.67911: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 7557 1726882119.67947: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 7557 1726882119.67986: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 7557 1726882119.68085: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 7557 1726882119.68088: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 7557 1726882119.68096: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 7557 1726882119.68128: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 7557 1726882119.68178: variable 
'__network_wireless_connections_defined' from source: role '' defaults 7557 1726882119.68447: variable 'network_connections' from source: task vars 7557 1726882119.68458: variable 'interface' from source: play vars 7557 1726882119.68532: variable 'interface' from source: play vars 7557 1726882119.68568: Evaluated conditional (__network_wpa_supplicant_required): False 7557 1726882119.68627: when evaluation is False, skipping this task 7557 1726882119.68630: _execute() done 7557 1726882119.68632: dumping result to json 7557 1726882119.68634: done dumping result, returning 7557 1726882119.68636: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [12673a56-9f93-ed48-b3a5-000000000118] 7557 1726882119.68646: sending task result for task 12673a56-9f93-ed48-b3a5-000000000118 7557 1726882119.69017: done sending task result for task 12673a56-9f93-ed48-b3a5-000000000118 7557 1726882119.69020: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 7557 1726882119.69062: no more pending results, returning what we have 7557 1726882119.69065: results queue empty 7557 1726882119.69066: checking for any_errors_fatal 7557 1726882119.69088: done checking for any_errors_fatal 7557 1726882119.69088: checking for max_fail_percentage 7557 1726882119.69090: done checking for max_fail_percentage 7557 1726882119.69091: checking to see if all hosts have failed and the running result is not ok 7557 1726882119.69092: done checking to see if all hosts have failed 7557 1726882119.69096: getting the remaining hosts for this loop 7557 1726882119.69098: done getting the remaining hosts for this loop 7557 1726882119.69101: getting the next task for host managed_node3 7557 1726882119.69106: done getting next task for host managed_node3 7557 1726882119.69110: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 7557 1726882119.69113: ^ state is: HOST STATE: block=2, task=37, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7557 1726882119.69136: getting variables 7557 1726882119.69138: in VariableManager get_vars() 7557 1726882119.69182: Calling all_inventory to load vars for managed_node3 7557 1726882119.69185: Calling groups_inventory to load vars for managed_node3 7557 1726882119.69187: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882119.69246: Calling all_plugins_play to load vars for managed_node3 7557 1726882119.69250: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882119.69254: Calling groups_plugins_play to load vars for managed_node3 7557 1726882119.70659: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882119.72275: done with get_vars() 7557 1726882119.72311: done getting variables 7557 1726882119.72369: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 21:28:39 -0400 (0:00:00.100) 0:00:45.577 ****** 7557 1726882119.72407: entering _queue_task() for managed_node3/service 7557 1726882119.72817: worker is 1 (out of 1 available) 7557 1726882119.72829: exiting _queue_task() for managed_node3/service 7557 1726882119.72841: done queuing things up, now waiting for results queue to drain 7557 1726882119.72843: waiting for pending results... 7557 1726882119.73097: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable network service 7557 1726882119.73247: in run() - task 12673a56-9f93-ed48-b3a5-000000000119 7557 1726882119.73269: variable 'ansible_search_path' from source: unknown 7557 1726882119.73305: variable 'ansible_search_path' from source: unknown 7557 1726882119.73605: calling self._execute() 7557 1726882119.73652: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882119.73663: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882119.73678: variable 'omit' from source: magic vars 7557 1726882119.74802: variable 'ansible_distribution_major_version' from source: facts 7557 1726882119.74806: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882119.74808: variable 'network_provider' from source: set_fact 7557 1726882119.74810: Evaluated conditional (network_provider == "initscripts"): False 7557 1726882119.74812: when evaluation is False, skipping this task 7557 1726882119.74815: _execute() done 7557 1726882119.75130: dumping result to json 7557 1726882119.75140: done dumping result, returning 7557 1726882119.75151: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable network service [12673a56-9f93-ed48-b3a5-000000000119] 7557 1726882119.75162: sending task result for task 12673a56-9f93-ed48-b3a5-000000000119 skipping: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 7557 1726882119.75346: no more pending results, returning what we have 7557 1726882119.75350: results queue empty 7557 1726882119.75352: checking for any_errors_fatal 7557 1726882119.75362: 
done checking for any_errors_fatal 7557 1726882119.75363: checking for max_fail_percentage 7557 1726882119.75365: done checking for max_fail_percentage 7557 1726882119.75366: checking to see if all hosts have failed and the running result is not ok 7557 1726882119.75367: done checking to see if all hosts have failed 7557 1726882119.75368: getting the remaining hosts for this loop 7557 1726882119.75370: done getting the remaining hosts for this loop 7557 1726882119.75374: getting the next task for host managed_node3 7557 1726882119.75380: done getting next task for host managed_node3 7557 1726882119.75385: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 7557 1726882119.75389: ^ state is: HOST STATE: block=2, task=37, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7557 1726882119.75420: getting variables 7557 1726882119.75422: in VariableManager get_vars() 7557 1726882119.75472: Calling all_inventory to load vars for managed_node3 7557 1726882119.75475: Calling groups_inventory to load vars for managed_node3 7557 1726882119.75477: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882119.75489: Calling all_plugins_play to load vars for managed_node3 7557 1726882119.75796: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882119.75803: Calling groups_plugins_play to load vars for managed_node3 7557 1726882119.77003: done sending task result for task 12673a56-9f93-ed48-b3a5-000000000119 7557 1726882119.77006: WORKER PROCESS EXITING 7557 1726882119.78923: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882119.82506: done with get_vars() 7557 1726882119.82538: done getting variables 7557 1726882119.82706: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 21:28:39 -0400 (0:00:00.103) 0:00:45.680 ****** 7557 1726882119.82743: entering _queue_task() for managed_node3/copy 7557 1726882119.83531: worker is 1 (out of 1 available) 7557 1726882119.83542: exiting _queue_task() for managed_node3/copy 7557 1726882119.83553: done queuing things up, now waiting for results queue to drain 7557 1726882119.83555: waiting for pending results... 
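The two skips recorded above are produced by ordinary when: conditions: "Enable and start wpa_supplicant" is skipped because __network_wpa_supplicant_required evaluated to False (no 802.1X or wireless connections are defined for veth0), and "Enable network service" is skipped because network_provider is "nm" rather than "initscripts". A hedged sketch of the pattern that yields such a skipping: result; the condition and module names mirror the log, but the real task definitions are in roles/network/tasks/main.yml:

    - name: Enable and start wpa_supplicant
      ansible.builtin.service:
        name: wpa_supplicant
        state: started
        enabled: true
      when: __network_wpa_supplicant_required | bool   # False on this run, so the task reports "skipping"
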
7557 1726882119.83960: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 7557 1726882119.84212: in run() - task 12673a56-9f93-ed48-b3a5-00000000011a 7557 1726882119.84303: variable 'ansible_search_path' from source: unknown 7557 1726882119.84307: variable 'ansible_search_path' from source: unknown 7557 1726882119.84350: calling self._execute() 7557 1726882119.84500: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882119.84506: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882119.84517: variable 'omit' from source: magic vars 7557 1726882119.85408: variable 'ansible_distribution_major_version' from source: facts 7557 1726882119.85423: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882119.85654: variable 'network_provider' from source: set_fact 7557 1726882119.85660: Evaluated conditional (network_provider == "initscripts"): False 7557 1726882119.85663: when evaluation is False, skipping this task 7557 1726882119.85748: _execute() done 7557 1726882119.85751: dumping result to json 7557 1726882119.85753: done dumping result, returning 7557 1726882119.85761: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [12673a56-9f93-ed48-b3a5-00000000011a] 7557 1726882119.85764: sending task result for task 12673a56-9f93-ed48-b3a5-00000000011a 7557 1726882119.85870: done sending task result for task 12673a56-9f93-ed48-b3a5-00000000011a 7557 1726882119.85873: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 7557 1726882119.85925: no more pending results, returning what we have 7557 1726882119.85930: results queue empty 7557 1726882119.85931: checking for any_errors_fatal 7557 1726882119.85940: done checking for any_errors_fatal 7557 1726882119.85941: checking for max_fail_percentage 7557 1726882119.85943: done checking for max_fail_percentage 7557 1726882119.85944: checking to see if all hosts have failed and the running result is not ok 7557 1726882119.85945: done checking to see if all hosts have failed 7557 1726882119.85946: getting the remaining hosts for this loop 7557 1726882119.85948: done getting the remaining hosts for this loop 7557 1726882119.85951: getting the next task for host managed_node3 7557 1726882119.85959: done getting next task for host managed_node3 7557 1726882119.85962: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 7557 1726882119.85967: ^ state is: HOST STATE: block=2, task=37, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7557 1726882119.85997: getting variables 7557 1726882119.85999: in VariableManager get_vars() 7557 1726882119.86048: Calling all_inventory to load vars for managed_node3 7557 1726882119.86051: Calling groups_inventory to load vars for managed_node3 7557 1726882119.86053: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882119.86064: Calling all_plugins_play to load vars for managed_node3 7557 1726882119.86066: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882119.86069: Calling groups_plugins_play to load vars for managed_node3 7557 1726882119.88671: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882119.92037: done with get_vars() 7557 1726882119.92071: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 21:28:39 -0400 (0:00:00.096) 0:00:45.776 ****** 7557 1726882119.92370: entering _queue_task() for managed_node3/fedora.linux_system_roles.network_connections 7557 1726882119.93138: worker is 1 (out of 1 available) 7557 1726882119.93151: exiting _queue_task() for managed_node3/fedora.linux_system_roles.network_connections 7557 1726882119.93163: done queuing things up, now waiting for results queue to drain 7557 1726882119.93165: waiting for pending results... 7557 1726882119.93504: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 7557 1726882119.93952: in run() - task 12673a56-9f93-ed48-b3a5-00000000011b 7557 1726882119.93956: variable 'ansible_search_path' from source: unknown 7557 1726882119.93958: variable 'ansible_search_path' from source: unknown 7557 1726882119.94005: calling self._execute() 7557 1726882119.94278: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882119.94281: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882119.94296: variable 'omit' from source: magic vars 7557 1726882119.95027: variable 'ansible_distribution_major_version' from source: facts 7557 1726882119.95066: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882119.95419: variable 'omit' from source: magic vars 7557 1726882119.95422: variable 'omit' from source: magic vars 7557 1726882119.95559: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 7557 1726882120.00269: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 7557 1726882120.00343: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 7557 1726882120.00388: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 7557 1726882120.00430: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 7557 1726882120.00463: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 7557 1726882120.00547: variable 'network_provider' from source: set_fact 7557 1726882120.00680: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 7557 1726882120.00716: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7557 1726882120.00747: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7557 1726882120.00795: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7557 1726882120.00817: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7557 1726882120.00898: variable 'omit' from source: magic vars 7557 1726882120.01019: variable 'omit' from source: magic vars 7557 1726882120.01127: variable 'network_connections' from source: task vars 7557 1726882120.01142: variable 'interface' from source: play vars 7557 1726882120.01209: variable 'interface' from source: play vars 7557 1726882120.01368: variable 'omit' from source: magic vars 7557 1726882120.01371: variable '__lsr_ansible_managed' from source: task vars 7557 1726882120.01427: variable '__lsr_ansible_managed' from source: task vars 7557 1726882120.01800: Loaded config def from plugin (lookup/template) 7557 1726882120.01804: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 7557 1726882120.01806: File lookup term: get_ansible_managed.j2 7557 1726882120.01808: variable 'ansible_search_path' from source: unknown 7557 1726882120.01810: evaluation_path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 7557 1726882120.01814: search_path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 7557 1726882120.01817: variable 'ansible_search_path' from source: unknown 7557 1726882120.08964: variable 'ansible_managed' from source: unknown 7557 1726882120.09110: variable 'omit' from source: magic vars 7557 1726882120.09152: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7557 1726882120.09183: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 
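For "Configure networking connection profiles" the executor resolves network_connections (with interface supplied from play vars) and renders get_ansible_managed.j2 into the "# Ansible managed / # system_role:network" header that is later handed to the module as __header. A rough sketch of how a play might drive this step, assuming the veth0 teardown profile that appears in the module arguments further down; the play structure here is illustrative, not the actual test playbook:

    - hosts: managed_node3
      vars:
        interface: veth0                    # play var referenced in the log
      tasks:
        - name: Tear down the test connection profile
          ansible.builtin.include_role:
            name: fedora.linux_system_roles.network
          vars:
            network_connections:
              - name: "{{ interface }}"
                persistent_state: absent
                state: down
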
7557 1726882120.09209: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7557 1726882120.09234: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7557 1726882120.09255: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7557 1726882120.09285: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7557 1726882120.09295: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882120.09303: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882120.09412: Set connection var ansible_module_compression to ZIP_DEFLATED 7557 1726882120.09425: Set connection var ansible_shell_executable to /bin/sh 7557 1726882120.09433: Set connection var ansible_shell_type to sh 7557 1726882120.09463: Set connection var ansible_pipelining to False 7557 1726882120.09466: Set connection var ansible_connection to ssh 7557 1726882120.09468: Set connection var ansible_timeout to 10 7557 1726882120.09488: variable 'ansible_shell_executable' from source: unknown 7557 1726882120.09499: variable 'ansible_connection' from source: unknown 7557 1726882120.09572: variable 'ansible_module_compression' from source: unknown 7557 1726882120.09575: variable 'ansible_shell_type' from source: unknown 7557 1726882120.09577: variable 'ansible_shell_executable' from source: unknown 7557 1726882120.09580: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882120.09582: variable 'ansible_pipelining' from source: unknown 7557 1726882120.09584: variable 'ansible_timeout' from source: unknown 7557 1726882120.09586: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882120.09685: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 7557 1726882120.09711: variable 'omit' from source: magic vars 7557 1726882120.09720: starting attempt loop 7557 1726882120.09727: running the handler 7557 1726882120.09743: _low_level_execute_command(): starting 7557 1726882120.09753: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7557 1726882120.10567: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882120.10586: stderr chunk (state=3): >>>debug2: fd 
3 setting O_NONBLOCK <<< 7557 1726882120.10789: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882120.10860: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882120.12637: stdout chunk (state=3): >>>/root <<< 7557 1726882120.12879: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882120.12883: stdout chunk (state=3): >>><<< 7557 1726882120.12885: stderr chunk (state=3): >>><<< 7557 1726882120.12888: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7557 1726882120.12890: _low_level_execute_command(): starting 7557 1726882120.12895: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882120.128172-9325-169443697377412 `" && echo ansible-tmp-1726882120.128172-9325-169443697377412="` echo /root/.ansible/tmp/ansible-tmp-1726882120.128172-9325-169443697377412 `" ) && sleep 0' 7557 1726882120.14199: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7557 1726882120.14204: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7557 1726882120.14206: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882120.14209: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7557 1726882120.14211: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 <<< 7557 1726882120.14213: stderr chunk (state=3): >>>debug2: match not found <<< 7557 1726882120.14215: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882120.14217: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7557 1726882120.14445: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882120.14449: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882120.14452: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882120.14454: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882120.16417: stdout chunk (state=3): >>>ansible-tmp-1726882120.128172-9325-169443697377412=/root/.ansible/tmp/ansible-tmp-1726882120.128172-9325-169443697377412 <<< 7557 1726882120.16662: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882120.16666: stdout chunk (state=3): >>><<< 7557 1726882120.16673: stderr chunk (state=3): >>><<< 7557 1726882120.16691: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882120.128172-9325-169443697377412=/root/.ansible/tmp/ansible-tmp-1726882120.128172-9325-169443697377412 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7557 1726882120.16741: variable 'ansible_module_compression' from source: unknown 7557 1726882120.16784: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-7557ap94rh2e/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 7557 1726882120.16835: variable 'ansible_facts' from source: unknown 7557 1726882120.17180: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882120.128172-9325-169443697377412/AnsiballZ_network_connections.py 7557 1726882120.17823: Sending initial data 7557 1726882120.17827: Sent initial data (165 bytes) 7557 1726882120.18995: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882120.19118: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882120.19131: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882120.19247: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882120.20769: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 7557 1726882120.20781: stderr chunk (state=3): >>>debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 7557 1726882120.20815: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 7557 1726882120.20902: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-7557ap94rh2e/tmp5ug014ml /root/.ansible/tmp/ansible-tmp-1726882120.128172-9325-169443697377412/AnsiballZ_network_connections.py <<< 7557 1726882120.20912: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882120.128172-9325-169443697377412/AnsiballZ_network_connections.py" <<< 7557 1726882120.20949: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-7557ap94rh2e/tmp5ug014ml" to remote "/root/.ansible/tmp/ansible-tmp-1726882120.128172-9325-169443697377412/AnsiballZ_network_connections.py" <<< 7557 1726882120.20987: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882120.128172-9325-169443697377412/AnsiballZ_network_connections.py" <<< 7557 1726882120.22573: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882120.22601: stderr chunk (state=3): >>><<< 7557 1726882120.22604: stdout chunk (state=3): >>><<< 7557 1726882120.22709: done transferring module to remote 7557 1726882120.22713: _low_level_execute_command(): starting 7557 1726882120.22715: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882120.128172-9325-169443697377412/ /root/.ansible/tmp/ansible-tmp-1726882120.128172-9325-169443697377412/AnsiballZ_network_connections.py && sleep 0' 7557 1726882120.23826: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7557 1726882120.23839: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882120.23850: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882120.24127: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882120.24138: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882120.24141: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882120.24257: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882120.25881: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882120.26026: stderr chunk (state=3): >>><<< 7557 1726882120.26029: stdout chunk (state=3): >>><<< 7557 1726882120.26143: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7557 1726882120.26146: _low_level_execute_command(): starting 7557 1726882120.26148: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882120.128172-9325-169443697377412/AnsiballZ_network_connections.py && sleep 0' 7557 1726882120.26806: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7557 1726882120.26836: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7557 1726882120.26842: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882120.26845: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config <<< 7557 1726882120.26848: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found <<< 7557 1726882120.26850: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882120.26952: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882120.26955: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882120.26957: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882120.27072: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882120.57344: stdout chunk (state=3): >>>Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_wld1z_t9/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back <<< 7557 1726882120.57355: stdout chunk (state=3): >>> File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_wld1z_t9/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on veth0/e060611b-cbb5-4b12-af25-a4b709ff9a49: error=unknown <<< 7557 1726882120.57537: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "veth0", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "veth0", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 7557 1726882120.59354: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. 
<<< 7557 1726882120.59374: stderr chunk (state=3): >>><<< 7557 1726882120.59377: stdout chunk (state=3): >>><<< 7557 1726882120.59397: _low_level_execute_command() done: rc=0, stdout=Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_wld1z_t9/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_wld1z_t9/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on veth0/e060611b-cbb5-4b12-af25-a4b709ff9a49: error=unknown {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "veth0", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "veth0", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. 
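The module run above returned changed=true, yet its stdout also carried a Python traceback ending in LsrNetworkNmError (Connection volatilize aborted on veth0: error=unknown), meaning NetworkManager reported an error while the veth0 profile was being removed; the returned JSON does not set a failed flag, so the traceback lands in the captured stdout rather than in a task failure. The logged module_args map directly onto a task like this sketch (parameter names copied from the invocation above; the __header value is normally injected by the role rather than written by hand):

    - name: Remove the veth0 connection profile
      fedora.linux_system_roles.network_connections:
        provider: nm
        connections:
          - name: veth0
            persistent_state: absent
            state: down
        __header: "#\n# Ansible managed\n#\n# system_role:network\n"
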
7557 1726882120.59429: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'veth0', 'persistent_state': 'absent', 'state': 'down'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882120.128172-9325-169443697377412/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7557 1726882120.59437: _low_level_execute_command(): starting 7557 1726882120.59441: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882120.128172-9325-169443697377412/ > /dev/null 2>&1 && sleep 0' 7557 1726882120.59866: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7557 1726882120.59870: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882120.59872: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 7557 1726882120.59874: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found <<< 7557 1726882120.59876: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882120.59925: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882120.59928: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882120.59935: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882120.59977: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882120.61774: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882120.61802: stderr chunk (state=3): >>><<< 7557 1726882120.61805: stdout chunk (state=3): >>><<< 7557 1726882120.61820: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests 
final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7557 1726882120.61826: handler run complete 7557 1726882120.61844: attempt loop complete, returning result 7557 1726882120.61847: _execute() done 7557 1726882120.61849: dumping result to json 7557 1726882120.61854: done dumping result, returning 7557 1726882120.61864: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [12673a56-9f93-ed48-b3a5-00000000011b] 7557 1726882120.61869: sending task result for task 12673a56-9f93-ed48-b3a5-00000000011b 7557 1726882120.61969: done sending task result for task 12673a56-9f93-ed48-b3a5-00000000011b 7557 1726882120.61972: WORKER PROCESS EXITING changed: [managed_node3] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "veth0", "persistent_state": "absent", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: 7557 1726882120.62100: no more pending results, returning what we have 7557 1726882120.62104: results queue empty 7557 1726882120.62105: checking for any_errors_fatal 7557 1726882120.62112: done checking for any_errors_fatal 7557 1726882120.62113: checking for max_fail_percentage 7557 1726882120.62115: done checking for max_fail_percentage 7557 1726882120.62116: checking to see if all hosts have failed and the running result is not ok 7557 1726882120.62117: done checking to see if all hosts have failed 7557 1726882120.62117: getting the remaining hosts for this loop 7557 1726882120.62119: done getting the remaining hosts for this loop 7557 1726882120.62122: getting the next task for host managed_node3 7557 1726882120.62127: done getting next task for host managed_node3 7557 1726882120.62131: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 7557 1726882120.62134: ^ state is: HOST STATE: block=2, task=37, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7557 1726882120.62145: getting variables 7557 1726882120.62147: in VariableManager get_vars() 7557 1726882120.62190: Calling all_inventory to load vars for managed_node3 7557 1726882120.62200: Calling groups_inventory to load vars for managed_node3 7557 1726882120.62203: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882120.62213: Calling all_plugins_play to load vars for managed_node3 7557 1726882120.62215: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882120.62218: Calling groups_plugins_play to load vars for managed_node3 7557 1726882120.63122: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882120.63974: done with get_vars() 7557 1726882120.63989: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 21:28:40 -0400 (0:00:00.716) 0:00:46.493 ****** 7557 1726882120.64053: entering _queue_task() for managed_node3/fedora.linux_system_roles.network_state 7557 1726882120.64285: worker is 1 (out of 1 available) 7557 1726882120.64302: exiting _queue_task() for managed_node3/fedora.linux_system_roles.network_state 7557 1726882120.64315: done queuing things up, now waiting for results queue to drain 7557 1726882120.64317: waiting for pending results... 7557 1726882120.64495: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking state 7557 1726882120.64592: in run() - task 12673a56-9f93-ed48-b3a5-00000000011c 7557 1726882120.64607: variable 'ansible_search_path' from source: unknown 7557 1726882120.64611: variable 'ansible_search_path' from source: unknown 7557 1726882120.64639: calling self._execute() 7557 1726882120.64719: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882120.64723: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882120.64733: variable 'omit' from source: magic vars 7557 1726882120.65010: variable 'ansible_distribution_major_version' from source: facts 7557 1726882120.65020: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882120.65105: variable 'network_state' from source: role '' defaults 7557 1726882120.65113: Evaluated conditional (network_state != {}): False 7557 1726882120.65116: when evaluation is False, skipping this task 7557 1726882120.65119: _execute() done 7557 1726882120.65121: dumping result to json 7557 1726882120.65123: done dumping result, returning 7557 1726882120.65131: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking state [12673a56-9f93-ed48-b3a5-00000000011c] 7557 1726882120.65135: sending task result for task 12673a56-9f93-ed48-b3a5-00000000011c 7557 1726882120.65230: done sending task result for task 12673a56-9f93-ed48-b3a5-00000000011c 7557 1726882120.65232: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 7557 1726882120.65281: no more pending results, returning what we have 7557 1726882120.65285: results queue empty 7557 1726882120.65287: checking for any_errors_fatal 7557 1726882120.65298: done checking for any_errors_fatal 7557 1726882120.65299: checking for max_fail_percentage 7557 1726882120.65300: done checking for 
max_fail_percentage 7557 1726882120.65301: checking to see if all hosts have failed and the running result is not ok 7557 1726882120.65302: done checking to see if all hosts have failed 7557 1726882120.65304: getting the remaining hosts for this loop 7557 1726882120.65306: done getting the remaining hosts for this loop 7557 1726882120.65309: getting the next task for host managed_node3 7557 1726882120.65314: done getting next task for host managed_node3 7557 1726882120.65318: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 7557 1726882120.65321: ^ state is: HOST STATE: block=2, task=37, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7557 1726882120.65339: getting variables 7557 1726882120.65340: in VariableManager get_vars() 7557 1726882120.65378: Calling all_inventory to load vars for managed_node3 7557 1726882120.65381: Calling groups_inventory to load vars for managed_node3 7557 1726882120.65383: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882120.65391: Calling all_plugins_play to load vars for managed_node3 7557 1726882120.65400: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882120.65404: Calling groups_plugins_play to load vars for managed_node3 7557 1726882120.66151: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882120.67104: done with get_vars() 7557 1726882120.67119: done getting variables 7557 1726882120.67161: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 21:28:40 -0400 (0:00:00.031) 0:00:46.524 ****** 7557 1726882120.67185: entering _queue_task() for managed_node3/debug 7557 1726882120.67400: worker is 1 (out of 1 available) 7557 1726882120.67417: exiting _queue_task() for managed_node3/debug 7557 1726882120.67429: done queuing things up, now waiting for results queue to drain 7557 1726882120.67431: waiting for pending results... 
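The skip recorded above is the role's guard on network_state: the logged false_condition is network_state != {}, and network_state comes from the role defaults, so with an empty value the task never runs. In schematic form the skipped task looks roughly like this (module name and condition are taken from the log; the argument wiring is an assumption, since a skipped task logs no module_args):

# Sketch of the skipped task at roles/network/tasks/main.yml:171; arguments assumed.
- name: Configure networking state
  fedora.linux_system_roles.network_state:
    # arguments not visible in the log (the task was skipped), so omitted here
  when: network_state != {}   # evaluated False, hence "skipping: [managed_node3]"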
7557 1726882120.67609: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 7557 1726882120.67692: in run() - task 12673a56-9f93-ed48-b3a5-00000000011d 7557 1726882120.67709: variable 'ansible_search_path' from source: unknown 7557 1726882120.67712: variable 'ansible_search_path' from source: unknown 7557 1726882120.67739: calling self._execute() 7557 1726882120.67820: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882120.67824: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882120.67834: variable 'omit' from source: magic vars 7557 1726882120.68103: variable 'ansible_distribution_major_version' from source: facts 7557 1726882120.68116: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882120.68119: variable 'omit' from source: magic vars 7557 1726882120.68155: variable 'omit' from source: magic vars 7557 1726882120.68179: variable 'omit' from source: magic vars 7557 1726882120.68217: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7557 1726882120.68245: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7557 1726882120.68260: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7557 1726882120.68273: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7557 1726882120.68284: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7557 1726882120.68314: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7557 1726882120.68317: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882120.68319: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882120.68389: Set connection var ansible_module_compression to ZIP_DEFLATED 7557 1726882120.68399: Set connection var ansible_shell_executable to /bin/sh 7557 1726882120.68402: Set connection var ansible_shell_type to sh 7557 1726882120.68407: Set connection var ansible_pipelining to False 7557 1726882120.68410: Set connection var ansible_connection to ssh 7557 1726882120.68416: Set connection var ansible_timeout to 10 7557 1726882120.68432: variable 'ansible_shell_executable' from source: unknown 7557 1726882120.68435: variable 'ansible_connection' from source: unknown 7557 1726882120.68439: variable 'ansible_module_compression' from source: unknown 7557 1726882120.68442: variable 'ansible_shell_type' from source: unknown 7557 1726882120.68444: variable 'ansible_shell_executable' from source: unknown 7557 1726882120.68446: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882120.68448: variable 'ansible_pipelining' from source: unknown 7557 1726882120.68450: variable 'ansible_timeout' from source: unknown 7557 1726882120.68458: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882120.68557: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 7557 1726882120.68569: variable 'omit' from source: 
magic vars 7557 1726882120.68572: starting attempt loop 7557 1726882120.68574: running the handler 7557 1726882120.68664: variable '__network_connections_result' from source: set_fact 7557 1726882120.68709: handler run complete 7557 1726882120.68722: attempt loop complete, returning result 7557 1726882120.68724: _execute() done 7557 1726882120.68727: dumping result to json 7557 1726882120.68730: done dumping result, returning 7557 1726882120.68739: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [12673a56-9f93-ed48-b3a5-00000000011d] 7557 1726882120.68742: sending task result for task 12673a56-9f93-ed48-b3a5-00000000011d 7557 1726882120.68823: done sending task result for task 12673a56-9f93-ed48-b3a5-00000000011d 7557 1726882120.68826: WORKER PROCESS EXITING ok: [managed_node3] => { "__network_connections_result.stderr_lines": [ "" ] } 7557 1726882120.68914: no more pending results, returning what we have 7557 1726882120.68917: results queue empty 7557 1726882120.68918: checking for any_errors_fatal 7557 1726882120.68923: done checking for any_errors_fatal 7557 1726882120.68924: checking for max_fail_percentage 7557 1726882120.68925: done checking for max_fail_percentage 7557 1726882120.68926: checking to see if all hosts have failed and the running result is not ok 7557 1726882120.68927: done checking to see if all hosts have failed 7557 1726882120.68927: getting the remaining hosts for this loop 7557 1726882120.68929: done getting the remaining hosts for this loop 7557 1726882120.68932: getting the next task for host managed_node3 7557 1726882120.68939: done getting next task for host managed_node3 7557 1726882120.68942: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 7557 1726882120.68945: ^ state is: HOST STATE: block=2, task=37, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7557 1726882120.68955: getting variables 7557 1726882120.68957: in VariableManager get_vars() 7557 1726882120.68997: Calling all_inventory to load vars for managed_node3 7557 1726882120.68999: Calling groups_inventory to load vars for managed_node3 7557 1726882120.69001: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882120.69010: Calling all_plugins_play to load vars for managed_node3 7557 1726882120.69012: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882120.69014: Calling groups_plugins_play to load vars for managed_node3 7557 1726882120.69748: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882120.70588: done with get_vars() 7557 1726882120.70605: done getting variables 7557 1726882120.70642: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 21:28:40 -0400 (0:00:00.034) 0:00:46.559 ****** 7557 1726882120.70665: entering _queue_task() for managed_node3/debug 7557 1726882120.70875: worker is 1 (out of 1 available) 7557 1726882120.70889: exiting _queue_task() for managed_node3/debug 7557 1726882120.70904: done queuing things up, now waiting for results queue to drain 7557 1726882120.70905: waiting for pending results... 7557 1726882120.71078: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 7557 1726882120.71173: in run() - task 12673a56-9f93-ed48-b3a5-00000000011e 7557 1726882120.71186: variable 'ansible_search_path' from source: unknown 7557 1726882120.71190: variable 'ansible_search_path' from source: unknown 7557 1726882120.71223: calling self._execute() 7557 1726882120.71302: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882120.71308: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882120.71317: variable 'omit' from source: magic vars 7557 1726882120.71584: variable 'ansible_distribution_major_version' from source: facts 7557 1726882120.71595: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882120.71603: variable 'omit' from source: magic vars 7557 1726882120.71646: variable 'omit' from source: magic vars 7557 1726882120.71672: variable 'omit' from source: magic vars 7557 1726882120.71708: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7557 1726882120.71734: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7557 1726882120.71749: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7557 1726882120.71763: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7557 1726882120.71773: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7557 1726882120.71803: variable 'inventory_hostname' from source: host vars for 
'managed_node3' 7557 1726882120.71807: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882120.71809: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882120.71878: Set connection var ansible_module_compression to ZIP_DEFLATED 7557 1726882120.71884: Set connection var ansible_shell_executable to /bin/sh 7557 1726882120.71887: Set connection var ansible_shell_type to sh 7557 1726882120.71898: Set connection var ansible_pipelining to False 7557 1726882120.71900: Set connection var ansible_connection to ssh 7557 1726882120.71902: Set connection var ansible_timeout to 10 7557 1726882120.71922: variable 'ansible_shell_executable' from source: unknown 7557 1726882120.71925: variable 'ansible_connection' from source: unknown 7557 1726882120.71928: variable 'ansible_module_compression' from source: unknown 7557 1726882120.71930: variable 'ansible_shell_type' from source: unknown 7557 1726882120.71933: variable 'ansible_shell_executable' from source: unknown 7557 1726882120.71935: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882120.71937: variable 'ansible_pipelining' from source: unknown 7557 1726882120.71940: variable 'ansible_timeout' from source: unknown 7557 1726882120.71944: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882120.72051: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 7557 1726882120.72060: variable 'omit' from source: magic vars 7557 1726882120.72065: starting attempt loop 7557 1726882120.72067: running the handler 7557 1726882120.72111: variable '__network_connections_result' from source: set_fact 7557 1726882120.72166: variable '__network_connections_result' from source: set_fact 7557 1726882120.72251: handler run complete 7557 1726882120.72267: attempt loop complete, returning result 7557 1726882120.72270: _execute() done 7557 1726882120.72273: dumping result to json 7557 1726882120.72275: done dumping result, returning 7557 1726882120.72283: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [12673a56-9f93-ed48-b3a5-00000000011e] 7557 1726882120.72288: sending task result for task 12673a56-9f93-ed48-b3a5-00000000011e 7557 1726882120.72378: done sending task result for task 12673a56-9f93-ed48-b3a5-00000000011e 7557 1726882120.72381: WORKER PROCESS EXITING ok: [managed_node3] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "veth0", "persistent_state": "absent", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "\n", "stderr_lines": [ "" ] } } 7557 1726882120.72465: no more pending results, returning what we have 7557 1726882120.72469: results queue empty 7557 1726882120.72470: checking for any_errors_fatal 7557 1726882120.72477: done checking for any_errors_fatal 7557 1726882120.72478: checking for max_fail_percentage 7557 1726882120.72479: done checking for max_fail_percentage 7557 1726882120.72480: checking to see if all hosts have failed and the running result is not ok 7557 
1726882120.72481: done checking to see if all hosts have failed 7557 1726882120.72482: getting the remaining hosts for this loop 7557 1726882120.72483: done getting the remaining hosts for this loop 7557 1726882120.72486: getting the next task for host managed_node3 7557 1726882120.72497: done getting next task for host managed_node3 7557 1726882120.72501: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 7557 1726882120.72503: ^ state is: HOST STATE: block=2, task=37, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7557 1726882120.72515: getting variables 7557 1726882120.72516: in VariableManager get_vars() 7557 1726882120.72557: Calling all_inventory to load vars for managed_node3 7557 1726882120.72559: Calling groups_inventory to load vars for managed_node3 7557 1726882120.72561: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882120.72570: Calling all_plugins_play to load vars for managed_node3 7557 1726882120.72572: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882120.72575: Calling groups_plugins_play to load vars for managed_node3 7557 1726882120.73483: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882120.74333: done with get_vars() 7557 1726882120.74352: done getting variables 7557 1726882120.74398: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 21:28:40 -0400 (0:00:00.037) 0:00:46.597 ****** 7557 1726882120.74429: entering _queue_task() for managed_node3/debug 7557 1726882120.74682: worker is 1 (out of 1 available) 7557 1726882120.74697: exiting _queue_task() for managed_node3/debug 7557 1726882120.74711: done queuing things up, now waiting for results queue to drain 7557 1726882120.74712: waiting for pending results... 
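The two ok: results above come from plain debug tasks over the __network_connections_result fact set earlier by the role; the printed keys (__network_connections_result.stderr_lines and __network_connections_result) match exactly what debug's var option produces. A sketch of those two tasks follows (task names and variable names from the log; everything else assumed):

# Sketch of the two debug tasks at roles/network/tasks/main.yml:177 and :181.
- name: Show stderr messages for the network_connections
  debug:
    var: __network_connections_result.stderr_lines

- name: Show debug messages for the network_connections
  debug:
    var: __network_connections_result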
7557 1726882120.74909: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 7557 1726882120.75016: in run() - task 12673a56-9f93-ed48-b3a5-00000000011f 7557 1726882120.75031: variable 'ansible_search_path' from source: unknown 7557 1726882120.75035: variable 'ansible_search_path' from source: unknown 7557 1726882120.75063: calling self._execute() 7557 1726882120.75146: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882120.75153: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882120.75161: variable 'omit' from source: magic vars 7557 1726882120.75435: variable 'ansible_distribution_major_version' from source: facts 7557 1726882120.75444: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882120.75532: variable 'network_state' from source: role '' defaults 7557 1726882120.75540: Evaluated conditional (network_state != {}): False 7557 1726882120.75543: when evaluation is False, skipping this task 7557 1726882120.75546: _execute() done 7557 1726882120.75549: dumping result to json 7557 1726882120.75551: done dumping result, returning 7557 1726882120.75560: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [12673a56-9f93-ed48-b3a5-00000000011f] 7557 1726882120.75566: sending task result for task 12673a56-9f93-ed48-b3a5-00000000011f 7557 1726882120.75652: done sending task result for task 12673a56-9f93-ed48-b3a5-00000000011f 7557 1726882120.75654: WORKER PROCESS EXITING skipping: [managed_node3] => { "false_condition": "network_state != {}" } 7557 1726882120.75700: no more pending results, returning what we have 7557 1726882120.75704: results queue empty 7557 1726882120.75705: checking for any_errors_fatal 7557 1726882120.75716: done checking for any_errors_fatal 7557 1726882120.75717: checking for max_fail_percentage 7557 1726882120.75719: done checking for max_fail_percentage 7557 1726882120.75719: checking to see if all hosts have failed and the running result is not ok 7557 1726882120.75720: done checking to see if all hosts have failed 7557 1726882120.75721: getting the remaining hosts for this loop 7557 1726882120.75722: done getting the remaining hosts for this loop 7557 1726882120.75726: getting the next task for host managed_node3 7557 1726882120.75732: done getting next task for host managed_node3 7557 1726882120.75736: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 7557 1726882120.75742: ^ state is: HOST STATE: block=2, task=37, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7557 1726882120.75763: getting variables 7557 1726882120.75765: in VariableManager get_vars() 7557 1726882120.75811: Calling all_inventory to load vars for managed_node3 7557 1726882120.75814: Calling groups_inventory to load vars for managed_node3 7557 1726882120.75816: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882120.75826: Calling all_plugins_play to load vars for managed_node3 7557 1726882120.75829: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882120.75831: Calling groups_plugins_play to load vars for managed_node3 7557 1726882120.76619: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882120.77471: done with get_vars() 7557 1726882120.77488: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 21:28:40 -0400 (0:00:00.031) 0:00:46.628 ****** 7557 1726882120.77561: entering _queue_task() for managed_node3/ping 7557 1726882120.77810: worker is 1 (out of 1 available) 7557 1726882120.77826: exiting _queue_task() for managed_node3/ping 7557 1726882120.77838: done queuing things up, now waiting for results queue to drain 7557 1726882120.77840: waiting for pending results... 7557 1726882120.78029: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Re-test connectivity 7557 1726882120.78122: in run() - task 12673a56-9f93-ed48-b3a5-000000000120 7557 1726882120.78133: variable 'ansible_search_path' from source: unknown 7557 1726882120.78136: variable 'ansible_search_path' from source: unknown 7557 1726882120.78165: calling self._execute() 7557 1726882120.78249: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882120.78253: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882120.78262: variable 'omit' from source: magic vars 7557 1726882120.78540: variable 'ansible_distribution_major_version' from source: facts 7557 1726882120.78549: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882120.78555: variable 'omit' from source: magic vars 7557 1726882120.78597: variable 'omit' from source: magic vars 7557 1726882120.78626: variable 'omit' from source: magic vars 7557 1726882120.78658: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7557 1726882120.78684: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7557 1726882120.78703: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7557 1726882120.78722: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7557 1726882120.78730: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7557 1726882120.78753: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7557 1726882120.78756: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882120.78759: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882120.78837: Set connection var ansible_module_compression to ZIP_DEFLATED 7557 1726882120.78843: Set connection var ansible_shell_executable to 
/bin/sh 7557 1726882120.78846: Set connection var ansible_shell_type to sh 7557 1726882120.78851: Set connection var ansible_pipelining to False 7557 1726882120.78853: Set connection var ansible_connection to ssh 7557 1726882120.78858: Set connection var ansible_timeout to 10 7557 1726882120.78875: variable 'ansible_shell_executable' from source: unknown 7557 1726882120.78878: variable 'ansible_connection' from source: unknown 7557 1726882120.78880: variable 'ansible_module_compression' from source: unknown 7557 1726882120.78883: variable 'ansible_shell_type' from source: unknown 7557 1726882120.78885: variable 'ansible_shell_executable' from source: unknown 7557 1726882120.78887: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882120.78891: variable 'ansible_pipelining' from source: unknown 7557 1726882120.78895: variable 'ansible_timeout' from source: unknown 7557 1726882120.78901: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882120.79050: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 7557 1726882120.79057: variable 'omit' from source: magic vars 7557 1726882120.79060: starting attempt loop 7557 1726882120.79065: running the handler 7557 1726882120.79076: _low_level_execute_command(): starting 7557 1726882120.79083: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7557 1726882120.79601: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882120.79604: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882120.79607: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration <<< 7557 1726882120.79611: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882120.79663: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882120.79667: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882120.79669: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882120.79722: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882120.81296: stdout chunk (state=3): >>>/root <<< 7557 1726882120.81395: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882120.81425: stderr chunk (state=3): >>><<< 7557 1726882120.81429: stdout chunk (state=3): >>><<< 7557 1726882120.81453: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7557 1726882120.81465: _low_level_execute_command(): starting 7557 1726882120.81471: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882120.8145335-9356-273377884641813 `" && echo ansible-tmp-1726882120.8145335-9356-273377884641813="` echo /root/.ansible/tmp/ansible-tmp-1726882120.8145335-9356-273377884641813 `" ) && sleep 0' 7557 1726882120.81927: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7557 1726882120.81931: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7557 1726882120.81933: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882120.81942: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration <<< 7557 1726882120.81945: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882120.81947: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882120.82002: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882120.82006: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882120.82009: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882120.82046: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882120.83921: stdout chunk (state=3): >>>ansible-tmp-1726882120.8145335-9356-273377884641813=/root/.ansible/tmp/ansible-tmp-1726882120.8145335-9356-273377884641813 <<< 7557 1726882120.84212: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882120.84216: stdout chunk 
(state=3): >>><<< 7557 1726882120.84219: stderr chunk (state=3): >>><<< 7557 1726882120.84222: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882120.8145335-9356-273377884641813=/root/.ansible/tmp/ansible-tmp-1726882120.8145335-9356-273377884641813 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7557 1726882120.84224: variable 'ansible_module_compression' from source: unknown 7557 1726882120.84226: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-7557ap94rh2e/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 7557 1726882120.84240: variable 'ansible_facts' from source: unknown 7557 1726882120.84342: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882120.8145335-9356-273377884641813/AnsiballZ_ping.py 7557 1726882120.84557: Sending initial data 7557 1726882120.84568: Sent initial data (151 bytes) 7557 1726882120.85173: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7557 1726882120.85211: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7557 1726882120.85232: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882120.85253: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7557 1726882120.85317: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882120.85379: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882120.85426: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882120.85503: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 
1726882120.87031: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 7557 1726882120.87070: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 7557 1726882120.87115: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-7557ap94rh2e/tmplcz506e9 /root/.ansible/tmp/ansible-tmp-1726882120.8145335-9356-273377884641813/AnsiballZ_ping.py <<< 7557 1726882120.87122: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882120.8145335-9356-273377884641813/AnsiballZ_ping.py" <<< 7557 1726882120.87163: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-7557ap94rh2e/tmplcz506e9" to remote "/root/.ansible/tmp/ansible-tmp-1726882120.8145335-9356-273377884641813/AnsiballZ_ping.py" <<< 7557 1726882120.87171: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882120.8145335-9356-273377884641813/AnsiballZ_ping.py" <<< 7557 1726882120.87706: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882120.87750: stderr chunk (state=3): >>><<< 7557 1726882120.87753: stdout chunk (state=3): >>><<< 7557 1726882120.87802: done transferring module to remote 7557 1726882120.87810: _low_level_execute_command(): starting 7557 1726882120.87815: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882120.8145335-9356-273377884641813/ /root/.ansible/tmp/ansible-tmp-1726882120.8145335-9356-273377884641813/AnsiballZ_ping.py && sleep 0' 7557 1726882120.88321: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7557 1726882120.88324: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found <<< 7557 1726882120.88326: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882120.88329: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 7557 1726882120.88337: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7557 1726882120.88340: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882120.88384: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882120.88388: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882120.88438: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882120.90189: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882120.90197: stdout chunk (state=3): >>><<< 7557 1726882120.90299: stderr chunk (state=3): >>><<< 7557 1726882120.90303: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7557 1726882120.90307: _low_level_execute_command(): starting 7557 1726882120.90309: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882120.8145335-9356-273377884641813/AnsiballZ_ping.py && sleep 0' 7557 1726882120.90806: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7557 1726882120.90809: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882120.90821: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882120.90844: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found <<< 7557 1726882120.90848: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882120.90898: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882120.90905: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 <<< 7557 1726882120.90963: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882121.05931: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 7557 1726882121.07223: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. <<< 7557 1726882121.07227: stdout chunk (state=3): >>><<< 7557 1726882121.07229: stderr chunk (state=3): >>><<< 7557 1726882121.07248: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. 
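The exchange above is the normal remote-execution cycle for the 'Re-test connectivity' task: create a remote temp dir, sftp AnsiballZ_ping.py across, chmod it, and run it with /usr/bin/python3.12, which returns {"ping": "pong"}; the temp dir is cleaned up just below. The task itself is simply the ping module; a minimal sketch (task name from the log, no arguments since the logged module_args carry only the default data=pong):

# Sketch of the connectivity re-test at roles/network/tasks/main.yml:192.
- name: Re-test connectivity
  ping: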
7557 1726882121.07280: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882120.8145335-9356-273377884641813/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7557 1726882121.07304: _low_level_execute_command(): starting 7557 1726882121.07329: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882120.8145335-9356-273377884641813/ > /dev/null 2>&1 && sleep 0' 7557 1726882121.08015: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7557 1726882121.08032: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7557 1726882121.08049: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882121.08156: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882121.08183: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882121.08283: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882121.10148: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882121.10152: stdout chunk (state=3): >>><<< 7557 1726882121.10155: stderr chunk (state=3): >>><<< 7557 1726882121.10300: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7557 1726882121.10304: handler run complete 7557 1726882121.10306: attempt loop complete, returning result 7557 1726882121.10308: _execute() done 7557 1726882121.10310: dumping result to json 7557 1726882121.10312: done dumping result, returning 7557 1726882121.10314: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Re-test connectivity [12673a56-9f93-ed48-b3a5-000000000120] 7557 1726882121.10316: sending task result for task 12673a56-9f93-ed48-b3a5-000000000120 7557 1726882121.10383: done sending task result for task 12673a56-9f93-ed48-b3a5-000000000120 7557 1726882121.10386: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "ping": "pong" } 7557 1726882121.10461: no more pending results, returning what we have 7557 1726882121.10466: results queue empty 7557 1726882121.10467: checking for any_errors_fatal 7557 1726882121.10475: done checking for any_errors_fatal 7557 1726882121.10476: checking for max_fail_percentage 7557 1726882121.10478: done checking for max_fail_percentage 7557 1726882121.10479: checking to see if all hosts have failed and the running result is not ok 7557 1726882121.10480: done checking to see if all hosts have failed 7557 1726882121.10480: getting the remaining hosts for this loop 7557 1726882121.10482: done getting the remaining hosts for this loop 7557 1726882121.10486: getting the next task for host managed_node3 7557 1726882121.10612: done getting next task for host managed_node3 7557 1726882121.10616: ^ task is: TASK: meta (role_complete) 7557 1726882121.10638: ^ state is: HOST STATE: block=2, task=38, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7557 1726882121.10654: getting variables 7557 1726882121.10656: in VariableManager get_vars() 7557 1726882121.10752: Calling all_inventory to load vars for managed_node3 7557 1726882121.10756: Calling groups_inventory to load vars for managed_node3 7557 1726882121.10758: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882121.10770: Calling all_plugins_play to load vars for managed_node3 7557 1726882121.10773: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882121.10777: Calling groups_plugins_play to load vars for managed_node3 7557 1726882121.19045: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882121.20806: done with get_vars() 7557 1726882121.20842: done getting variables 7557 1726882121.20916: done queuing things up, now waiting for results queue to drain 7557 1726882121.20919: results queue empty 7557 1726882121.20919: checking for any_errors_fatal 7557 1726882121.20923: done checking for any_errors_fatal 7557 1726882121.20924: checking for max_fail_percentage 7557 1726882121.20925: done checking for max_fail_percentage 7557 1726882121.20926: checking to see if all hosts have failed and the running result is not ok 7557 1726882121.20926: done checking to see if all hosts have failed 7557 1726882121.20927: getting the remaining hosts for this loop 7557 1726882121.20928: done getting the remaining hosts for this loop 7557 1726882121.20931: getting the next task for host managed_node3 7557 1726882121.20935: done getting next task for host managed_node3 7557 1726882121.20937: ^ task is: TASK: Include the task 'manage_test_interface.yml' 7557 1726882121.20939: ^ state is: HOST STATE: block=2, task=39, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7557 1726882121.20941: getting variables 7557 1726882121.20942: in VariableManager get_vars() 7557 1726882121.20962: Calling all_inventory to load vars for managed_node3 7557 1726882121.20964: Calling groups_inventory to load vars for managed_node3 7557 1726882121.20967: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882121.20972: Calling all_plugins_play to load vars for managed_node3 7557 1726882121.20974: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882121.20977: Calling groups_plugins_play to load vars for managed_node3 7557 1726882121.22187: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882121.23817: done with get_vars() 7557 1726882121.23845: done getting variables TASK [Include the task 'manage_test_interface.yml'] **************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_auto_gateway.yml:145 Friday 20 September 2024 21:28:41 -0400 (0:00:00.463) 0:00:47.092 ****** 7557 1726882121.23920: entering _queue_task() for managed_node3/include_tasks 7557 1726882121.24319: worker is 1 (out of 1 available) 7557 1726882121.24331: exiting _queue_task() for managed_node3/include_tasks 7557 1726882121.24344: done queuing things up, now waiting for results queue to drain 7557 1726882121.24346: waiting for pending results... 
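[editor's note] The run now reaches "Include the task 'manage_test_interface.yml'" at tests_auto_gateway.yml:145, and the following entry shows the per-task conditional ansible_distribution_major_version != '6' evaluating True before the include is processed. A hedged sketch of what such an include could look like; the relative path spelling, where the when condition is declared, and any vars passed are assumptions:

    # Sketch consistent with the task name and path in the log above.
    - name: Include the task 'manage_test_interface.yml'
      include_tasks: tasks/manage_test_interface.yml
      when: ansible_distribution_major_version != '6'   # assumed location of the conditional seen in the log
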
7557 1726882121.24717: running TaskExecutor() for managed_node3/TASK: Include the task 'manage_test_interface.yml' 7557 1726882121.24757: in run() - task 12673a56-9f93-ed48-b3a5-000000000150 7557 1726882121.24814: variable 'ansible_search_path' from source: unknown 7557 1726882121.24831: calling self._execute() 7557 1726882121.24947: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882121.24960: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882121.24974: variable 'omit' from source: magic vars 7557 1726882121.25465: variable 'ansible_distribution_major_version' from source: facts 7557 1726882121.25469: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882121.25472: _execute() done 7557 1726882121.25475: dumping result to json 7557 1726882121.25477: done dumping result, returning 7557 1726882121.25480: done running TaskExecutor() for managed_node3/TASK: Include the task 'manage_test_interface.yml' [12673a56-9f93-ed48-b3a5-000000000150] 7557 1726882121.25483: sending task result for task 12673a56-9f93-ed48-b3a5-000000000150 7557 1726882121.25801: done sending task result for task 12673a56-9f93-ed48-b3a5-000000000150 7557 1726882121.25805: WORKER PROCESS EXITING 7557 1726882121.25838: no more pending results, returning what we have 7557 1726882121.25843: in VariableManager get_vars() 7557 1726882121.25911: Calling all_inventory to load vars for managed_node3 7557 1726882121.25917: Calling groups_inventory to load vars for managed_node3 7557 1726882121.25920: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882121.25935: Calling all_plugins_play to load vars for managed_node3 7557 1726882121.25938: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882121.25942: Calling groups_plugins_play to load vars for managed_node3 7557 1726882121.27658: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882121.29162: done with get_vars() 7557 1726882121.29187: variable 'ansible_search_path' from source: unknown 7557 1726882121.29209: we have included files to process 7557 1726882121.29210: generating all_blocks data 7557 1726882121.29213: done generating all_blocks data 7557 1726882121.29218: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 7557 1726882121.29220: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 7557 1726882121.29223: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 7557 1726882121.29665: in VariableManager get_vars() 7557 1726882121.29707: done with get_vars() 7557 1726882121.30415: done processing included file 7557 1726882121.30418: iterating over new_blocks loaded from include file 7557 1726882121.30420: in VariableManager get_vars() 7557 1726882121.30446: done with get_vars() 7557 1726882121.30448: filtering new block on tags 7557 1726882121.30489: done filtering new block on tags 7557 1726882121.30496: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml for managed_node3 7557 1726882121.30503: extending task lists for all hosts with included blocks 7557 
1726882121.35350: done extending task lists 7557 1726882121.35352: done processing included files 7557 1726882121.35352: results queue empty 7557 1726882121.35353: checking for any_errors_fatal 7557 1726882121.35354: done checking for any_errors_fatal 7557 1726882121.35354: checking for max_fail_percentage 7557 1726882121.35355: done checking for max_fail_percentage 7557 1726882121.35356: checking to see if all hosts have failed and the running result is not ok 7557 1726882121.35356: done checking to see if all hosts have failed 7557 1726882121.35357: getting the remaining hosts for this loop 7557 1726882121.35358: done getting the remaining hosts for this loop 7557 1726882121.35359: getting the next task for host managed_node3 7557 1726882121.35362: done getting next task for host managed_node3 7557 1726882121.35364: ^ task is: TASK: Ensure state in ["present", "absent"] 7557 1726882121.35365: ^ state is: HOST STATE: block=2, task=40, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7557 1726882121.35367: getting variables 7557 1726882121.35368: in VariableManager get_vars() 7557 1726882121.35384: Calling all_inventory to load vars for managed_node3 7557 1726882121.35388: Calling groups_inventory to load vars for managed_node3 7557 1726882121.35390: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882121.35401: Calling all_plugins_play to load vars for managed_node3 7557 1726882121.35404: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882121.35407: Calling groups_plugins_play to load vars for managed_node3 7557 1726882121.36660: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882121.37523: done with get_vars() 7557 1726882121.37538: done getting variables 7557 1726882121.37572: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Ensure state in ["present", "absent"]] *********************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:3 Friday 20 September 2024 21:28:41 -0400 (0:00:00.136) 0:00:47.228 ****** 7557 1726882121.37596: entering _queue_task() for managed_node3/fail 7557 1726882121.37852: worker is 1 (out of 1 available) 7557 1726882121.37864: exiting _queue_task() for managed_node3/fail 7557 1726882121.37877: done queuing things up, now waiting for results queue to drain 7557 1726882121.37878: waiting for pending results... 
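[editor's note] The first task pulled in from manage_test_interface.yml is the guard "Ensure state in ["present", "absent"]" (manage_test_interface.yml:3), run via the fail action and skipped because the logged false_condition `state not in ["present", "absent"]` evaluated False. A sketch of that guard, with only the message text assumed:

    - name: Ensure state in ["present", "absent"]
      fail:
        msg: value of state must be present or absent   # exact message text is an assumption
      when: state not in ["present", "absent"]

Because `state` is within the allowed set on this run, the task is reported as skipping: with skip_reason "Conditional result was False", exactly as the log shows.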
7557 1726882121.38059: running TaskExecutor() for managed_node3/TASK: Ensure state in ["present", "absent"] 7557 1726882121.38125: in run() - task 12673a56-9f93-ed48-b3a5-000000001a6f 7557 1726882121.38135: variable 'ansible_search_path' from source: unknown 7557 1726882121.38139: variable 'ansible_search_path' from source: unknown 7557 1726882121.38168: calling self._execute() 7557 1726882121.38253: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882121.38257: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882121.38265: variable 'omit' from source: magic vars 7557 1726882121.38798: variable 'ansible_distribution_major_version' from source: facts 7557 1726882121.38801: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882121.38804: variable 'state' from source: include params 7557 1726882121.38806: Evaluated conditional (state not in ["present", "absent"]): False 7557 1726882121.38809: when evaluation is False, skipping this task 7557 1726882121.38813: _execute() done 7557 1726882121.38815: dumping result to json 7557 1726882121.38817: done dumping result, returning 7557 1726882121.38820: done running TaskExecutor() for managed_node3/TASK: Ensure state in ["present", "absent"] [12673a56-9f93-ed48-b3a5-000000001a6f] 7557 1726882121.38823: sending task result for task 12673a56-9f93-ed48-b3a5-000000001a6f 7557 1726882121.38920: done sending task result for task 12673a56-9f93-ed48-b3a5-000000001a6f 7557 1726882121.38923: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "state not in [\"present\", \"absent\"]", "skip_reason": "Conditional result was False" } 7557 1726882121.38975: no more pending results, returning what we have 7557 1726882121.38979: results queue empty 7557 1726882121.38981: checking for any_errors_fatal 7557 1726882121.38983: done checking for any_errors_fatal 7557 1726882121.38984: checking for max_fail_percentage 7557 1726882121.38985: done checking for max_fail_percentage 7557 1726882121.38986: checking to see if all hosts have failed and the running result is not ok 7557 1726882121.38987: done checking to see if all hosts have failed 7557 1726882121.38988: getting the remaining hosts for this loop 7557 1726882121.38990: done getting the remaining hosts for this loop 7557 1726882121.38997: getting the next task for host managed_node3 7557 1726882121.39004: done getting next task for host managed_node3 7557 1726882121.39006: ^ task is: TASK: Ensure type in ["dummy", "tap", "veth"] 7557 1726882121.39010: ^ state is: HOST STATE: block=2, task=40, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7557 1726882121.39014: getting variables 7557 1726882121.39016: in VariableManager get_vars() 7557 1726882121.39070: Calling all_inventory to load vars for managed_node3 7557 1726882121.39074: Calling groups_inventory to load vars for managed_node3 7557 1726882121.39077: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882121.39091: Calling all_plugins_play to load vars for managed_node3 7557 1726882121.39215: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882121.39220: Calling groups_plugins_play to load vars for managed_node3 7557 1726882121.40326: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882121.41183: done with get_vars() 7557 1726882121.41202: done getting variables 7557 1726882121.41245: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Ensure type in ["dummy", "tap", "veth"]] ********************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:8 Friday 20 September 2024 21:28:41 -0400 (0:00:00.036) 0:00:47.265 ****** 7557 1726882121.41273: entering _queue_task() for managed_node3/fail 7557 1726882121.41558: worker is 1 (out of 1 available) 7557 1726882121.41570: exiting _queue_task() for managed_node3/fail 7557 1726882121.41584: done queuing things up, now waiting for results queue to drain 7557 1726882121.41585: waiting for pending results... 7557 1726882121.41880: running TaskExecutor() for managed_node3/TASK: Ensure type in ["dummy", "tap", "veth"] 7557 1726882121.42026: in run() - task 12673a56-9f93-ed48-b3a5-000000001a70 7557 1726882121.42031: variable 'ansible_search_path' from source: unknown 7557 1726882121.42033: variable 'ansible_search_path' from source: unknown 7557 1726882121.42071: calling self._execute() 7557 1726882121.42200: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882121.42203: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882121.42206: variable 'omit' from source: magic vars 7557 1726882121.42598: variable 'ansible_distribution_major_version' from source: facts 7557 1726882121.42612: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882121.42799: variable 'type' from source: play vars 7557 1726882121.42802: Evaluated conditional (type not in ["dummy", "tap", "veth"]): False 7557 1726882121.42805: when evaluation is False, skipping this task 7557 1726882121.42807: _execute() done 7557 1726882121.42810: dumping result to json 7557 1726882121.42812: done dumping result, returning 7557 1726882121.42815: done running TaskExecutor() for managed_node3/TASK: Ensure type in ["dummy", "tap", "veth"] [12673a56-9f93-ed48-b3a5-000000001a70] 7557 1726882121.42817: sending task result for task 12673a56-9f93-ed48-b3a5-000000001a70 skipping: [managed_node3] => { "changed": false, "false_condition": "type not in [\"dummy\", \"tap\", \"veth\"]", "skip_reason": "Conditional result was False" } 7557 1726882121.43135: no more pending results, returning what we have 7557 1726882121.43138: results queue empty 7557 1726882121.43139: checking for any_errors_fatal 7557 1726882121.43146: done checking for 
any_errors_fatal 7557 1726882121.43147: checking for max_fail_percentage 7557 1726882121.43149: done checking for max_fail_percentage 7557 1726882121.43149: checking to see if all hosts have failed and the running result is not ok 7557 1726882121.43150: done checking to see if all hosts have failed 7557 1726882121.43151: getting the remaining hosts for this loop 7557 1726882121.43152: done getting the remaining hosts for this loop 7557 1726882121.43155: getting the next task for host managed_node3 7557 1726882121.43161: done getting next task for host managed_node3 7557 1726882121.43163: ^ task is: TASK: Include the task 'show_interfaces.yml' 7557 1726882121.43166: ^ state is: HOST STATE: block=2, task=40, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7557 1726882121.43169: getting variables 7557 1726882121.43170: in VariableManager get_vars() 7557 1726882121.43215: Calling all_inventory to load vars for managed_node3 7557 1726882121.43217: Calling groups_inventory to load vars for managed_node3 7557 1726882121.43221: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882121.43230: Calling all_plugins_play to load vars for managed_node3 7557 1726882121.43233: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882121.43236: Calling groups_plugins_play to load vars for managed_node3 7557 1726882121.44304: done sending task result for task 12673a56-9f93-ed48-b3a5-000000001a70 7557 1726882121.44308: WORKER PROCESS EXITING 7557 1726882121.44319: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882121.45174: done with get_vars() 7557 1726882121.45189: done getting variables TASK [Include the task 'show_interfaces.yml'] ********************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:13 Friday 20 September 2024 21:28:41 -0400 (0:00:00.039) 0:00:47.305 ****** 7557 1726882121.45259: entering _queue_task() for managed_node3/include_tasks 7557 1726882121.45570: worker is 1 (out of 1 available) 7557 1726882121.45581: exiting _queue_task() for managed_node3/include_tasks 7557 1726882121.45599: done queuing things up, now waiting for results queue to drain 7557 1726882121.45601: waiting for pending results... 
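[editor's note] The second guard, "Ensure type in ["dummy", "tap", "veth"]" (manage_test_interface.yml:8), follows the same pattern and is likewise skipped, since the logged false_condition `type not in ["dummy", "tap", "veth"]` evaluated False. A sketch, again with only the message text assumed:

    - name: Ensure type in ["dummy", "tap", "veth"]
      fail:
        msg: value of type must be dummy, tap or veth   # exact message text is an assumption
      when: type not in ["dummy", "tap", "veth"]
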
7557 1726882121.46014: running TaskExecutor() for managed_node3/TASK: Include the task 'show_interfaces.yml' 7557 1726882121.46020: in run() - task 12673a56-9f93-ed48-b3a5-000000001a71 7557 1726882121.46042: variable 'ansible_search_path' from source: unknown 7557 1726882121.46056: variable 'ansible_search_path' from source: unknown 7557 1726882121.46102: calling self._execute() 7557 1726882121.46243: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882121.46281: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882121.46286: variable 'omit' from source: magic vars 7557 1726882121.46576: variable 'ansible_distribution_major_version' from source: facts 7557 1726882121.46589: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882121.46592: _execute() done 7557 1726882121.46598: dumping result to json 7557 1726882121.46602: done dumping result, returning 7557 1726882121.46608: done running TaskExecutor() for managed_node3/TASK: Include the task 'show_interfaces.yml' [12673a56-9f93-ed48-b3a5-000000001a71] 7557 1726882121.46610: sending task result for task 12673a56-9f93-ed48-b3a5-000000001a71 7557 1726882121.46699: done sending task result for task 12673a56-9f93-ed48-b3a5-000000001a71 7557 1726882121.46702: WORKER PROCESS EXITING 7557 1726882121.46728: no more pending results, returning what we have 7557 1726882121.46733: in VariableManager get_vars() 7557 1726882121.46789: Calling all_inventory to load vars for managed_node3 7557 1726882121.46792: Calling groups_inventory to load vars for managed_node3 7557 1726882121.46798: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882121.46810: Calling all_plugins_play to load vars for managed_node3 7557 1726882121.46813: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882121.46815: Calling groups_plugins_play to load vars for managed_node3 7557 1726882121.47602: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882121.48555: done with get_vars() 7557 1726882121.48568: variable 'ansible_search_path' from source: unknown 7557 1726882121.48568: variable 'ansible_search_path' from source: unknown 7557 1726882121.48595: we have included files to process 7557 1726882121.48597: generating all_blocks data 7557 1726882121.48598: done generating all_blocks data 7557 1726882121.48601: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 7557 1726882121.48602: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 7557 1726882121.48603: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 7557 1726882121.48671: in VariableManager get_vars() 7557 1726882121.48690: done with get_vars() 7557 1726882121.48769: done processing included file 7557 1726882121.48770: iterating over new_blocks loaded from include file 7557 1726882121.48771: in VariableManager get_vars() 7557 1726882121.48786: done with get_vars() 7557 1726882121.48787: filtering new block on tags 7557 1726882121.48801: done filtering new block on tags 7557 1726882121.48803: done iterating over new_blocks loaded from include file included: 
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml for managed_node3 7557 1726882121.48807: extending task lists for all hosts with included blocks 7557 1726882121.49032: done extending task lists 7557 1726882121.49033: done processing included files 7557 1726882121.49034: results queue empty 7557 1726882121.49034: checking for any_errors_fatal 7557 1726882121.49036: done checking for any_errors_fatal 7557 1726882121.49036: checking for max_fail_percentage 7557 1726882121.49037: done checking for max_fail_percentage 7557 1726882121.49038: checking to see if all hosts have failed and the running result is not ok 7557 1726882121.49038: done checking to see if all hosts have failed 7557 1726882121.49039: getting the remaining hosts for this loop 7557 1726882121.49039: done getting the remaining hosts for this loop 7557 1726882121.49041: getting the next task for host managed_node3 7557 1726882121.49043: done getting next task for host managed_node3 7557 1726882121.49045: ^ task is: TASK: Include the task 'get_current_interfaces.yml' 7557 1726882121.49047: ^ state is: HOST STATE: block=2, task=40, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7557 1726882121.49049: getting variables 7557 1726882121.49049: in VariableManager get_vars() 7557 1726882121.49061: Calling all_inventory to load vars for managed_node3 7557 1726882121.49063: Calling groups_inventory to load vars for managed_node3 7557 1726882121.49065: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882121.49070: Calling all_plugins_play to load vars for managed_node3 7557 1726882121.49071: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882121.49073: Calling groups_plugins_play to load vars for managed_node3 7557 1726882121.49721: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882121.50565: done with get_vars() 7557 1726882121.50584: done getting variables TASK [Include the task 'get_current_interfaces.yml'] *************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:3 Friday 20 September 2024 21:28:41 -0400 (0:00:00.053) 0:00:47.359 ****** 7557 1726882121.50645: entering _queue_task() for managed_node3/include_tasks 7557 1726882121.50907: worker is 1 (out of 1 available) 7557 1726882121.50920: exiting _queue_task() for managed_node3/include_tasks 7557 1726882121.50935: done queuing things up, now waiting for results queue to drain 7557 1726882121.50936: waiting for pending results... 
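[editor's note] The entries above show the include chain the log is walking: manage_test_interface.yml:13 includes show_interfaces.yml, whose line 3 in turn includes get_current_interfaces.yml. A sketch of that first task of show_interfaces.yml; anything beyond this one include is an assumption:

    # show_interfaces.yml (sketch of its first task, per show_interfaces.yml:3 in the log)
    - name: Include the task 'get_current_interfaces.yml'
      include_tasks: get_current_interfaces.yml
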
7557 1726882121.51118: running TaskExecutor() for managed_node3/TASK: Include the task 'get_current_interfaces.yml' 7557 1726882121.51199: in run() - task 12673a56-9f93-ed48-b3a5-000000001d1c 7557 1726882121.51208: variable 'ansible_search_path' from source: unknown 7557 1726882121.51211: variable 'ansible_search_path' from source: unknown 7557 1726882121.51244: calling self._execute() 7557 1726882121.51326: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882121.51330: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882121.51339: variable 'omit' from source: magic vars 7557 1726882121.51616: variable 'ansible_distribution_major_version' from source: facts 7557 1726882121.51627: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882121.51632: _execute() done 7557 1726882121.51635: dumping result to json 7557 1726882121.51638: done dumping result, returning 7557 1726882121.51644: done running TaskExecutor() for managed_node3/TASK: Include the task 'get_current_interfaces.yml' [12673a56-9f93-ed48-b3a5-000000001d1c] 7557 1726882121.51649: sending task result for task 12673a56-9f93-ed48-b3a5-000000001d1c 7557 1726882121.51765: no more pending results, returning what we have 7557 1726882121.51770: in VariableManager get_vars() 7557 1726882121.51828: Calling all_inventory to load vars for managed_node3 7557 1726882121.51831: Calling groups_inventory to load vars for managed_node3 7557 1726882121.51833: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882121.51845: Calling all_plugins_play to load vars for managed_node3 7557 1726882121.51848: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882121.51851: Calling groups_plugins_play to load vars for managed_node3 7557 1726882121.52748: done sending task result for task 12673a56-9f93-ed48-b3a5-000000001d1c 7557 1726882121.52752: WORKER PROCESS EXITING 7557 1726882121.52762: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882121.53634: done with get_vars() 7557 1726882121.53647: variable 'ansible_search_path' from source: unknown 7557 1726882121.53647: variable 'ansible_search_path' from source: unknown 7557 1726882121.53684: we have included files to process 7557 1726882121.53685: generating all_blocks data 7557 1726882121.53686: done generating all_blocks data 7557 1726882121.53686: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 7557 1726882121.53687: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 7557 1726882121.53688: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 7557 1726882121.53865: done processing included file 7557 1726882121.53866: iterating over new_blocks loaded from include file 7557 1726882121.53868: in VariableManager get_vars() 7557 1726882121.53883: done with get_vars() 7557 1726882121.53885: filtering new block on tags 7557 1726882121.53899: done filtering new block on tags 7557 1726882121.53901: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml for managed_node3 7557 1726882121.53905: extending 
task lists for all hosts with included blocks 7557 1726882121.53990: done extending task lists 7557 1726882121.53991: done processing included files 7557 1726882121.53992: results queue empty 7557 1726882121.53995: checking for any_errors_fatal 7557 1726882121.53997: done checking for any_errors_fatal 7557 1726882121.53998: checking for max_fail_percentage 7557 1726882121.53999: done checking for max_fail_percentage 7557 1726882121.53999: checking to see if all hosts have failed and the running result is not ok 7557 1726882121.54000: done checking to see if all hosts have failed 7557 1726882121.54000: getting the remaining hosts for this loop 7557 1726882121.54001: done getting the remaining hosts for this loop 7557 1726882121.54002: getting the next task for host managed_node3 7557 1726882121.54005: done getting next task for host managed_node3 7557 1726882121.54007: ^ task is: TASK: Gather current interface info 7557 1726882121.54009: ^ state is: HOST STATE: block=2, task=40, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7557 1726882121.54011: getting variables 7557 1726882121.54011: in VariableManager get_vars() 7557 1726882121.54023: Calling all_inventory to load vars for managed_node3 7557 1726882121.54024: Calling groups_inventory to load vars for managed_node3 7557 1726882121.54025: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882121.54029: Calling all_plugins_play to load vars for managed_node3 7557 1726882121.54030: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882121.54032: Calling groups_plugins_play to load vars for managed_node3 7557 1726882121.54661: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882121.55498: done with get_vars() 7557 1726882121.55512: done getting variables 7557 1726882121.55538: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gather current interface info] ******************************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3 Friday 20 September 2024 21:28:41 -0400 (0:00:00.049) 0:00:47.408 ****** 7557 1726882121.55560: entering _queue_task() for managed_node3/command 7557 1726882121.55803: worker is 1 (out of 1 available) 7557 1726882121.55816: exiting _queue_task() for managed_node3/command 7557 1726882121.55828: done queuing things up, now waiting for results queue to drain 7557 1726882121.55829: waiting for pending results... 7557 1726882121.56007: running TaskExecutor() for managed_node3/TASK: Gather current interface info 7557 1726882121.56084: in run() - task 12673a56-9f93-ed48-b3a5-000000001d53 7557 1726882121.56170: variable 'ansible_search_path' from source: unknown 7557 1726882121.56174: variable 'ansible_search_path' from source: unknown 7557 1726882121.56178: calling self._execute() 7557 1726882121.56207: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882121.56212: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882121.56220: variable 'omit' from source: magic vars 7557 1726882121.56500: variable 'ansible_distribution_major_version' from source: facts 7557 1726882121.56511: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882121.56515: variable 'omit' from source: magic vars 7557 1726882121.56548: variable 'omit' from source: magic vars 7557 1726882121.56572: variable 'omit' from source: magic vars 7557 1726882121.56606: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7557 1726882121.56632: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7557 1726882121.56647: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7557 1726882121.56661: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7557 1726882121.56671: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7557 1726882121.56698: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7557 
1726882121.56701: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882121.56705: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882121.56775: Set connection var ansible_module_compression to ZIP_DEFLATED 7557 1726882121.56781: Set connection var ansible_shell_executable to /bin/sh 7557 1726882121.56784: Set connection var ansible_shell_type to sh 7557 1726882121.56789: Set connection var ansible_pipelining to False 7557 1726882121.56791: Set connection var ansible_connection to ssh 7557 1726882121.56798: Set connection var ansible_timeout to 10 7557 1726882121.56816: variable 'ansible_shell_executable' from source: unknown 7557 1726882121.56819: variable 'ansible_connection' from source: unknown 7557 1726882121.56822: variable 'ansible_module_compression' from source: unknown 7557 1726882121.56826: variable 'ansible_shell_type' from source: unknown 7557 1726882121.56829: variable 'ansible_shell_executable' from source: unknown 7557 1726882121.56831: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882121.56833: variable 'ansible_pipelining' from source: unknown 7557 1726882121.56835: variable 'ansible_timeout' from source: unknown 7557 1726882121.56837: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882121.56935: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 7557 1726882121.56945: variable 'omit' from source: magic vars 7557 1726882121.56957: starting attempt loop 7557 1726882121.56959: running the handler 7557 1726882121.56967: _low_level_execute_command(): starting 7557 1726882121.56974: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7557 1726882121.57478: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882121.57482: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882121.57486: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7557 1726882121.57488: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882121.57541: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882121.57544: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882121.57547: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882121.57612: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 
1726882121.59264: stdout chunk (state=3): >>>/root <<< 7557 1726882121.59361: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882121.59386: stderr chunk (state=3): >>><<< 7557 1726882121.59389: stdout chunk (state=3): >>><<< 7557 1726882121.59415: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7557 1726882121.59429: _low_level_execute_command(): starting 7557 1726882121.59432: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882121.5941398-9387-156320432099887 `" && echo ansible-tmp-1726882121.5941398-9387-156320432099887="` echo /root/.ansible/tmp/ansible-tmp-1726882121.5941398-9387-156320432099887 `" ) && sleep 0' 7557 1726882121.59853: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882121.59858: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882121.59869: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration <<< 7557 1726882121.59871: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found <<< 7557 1726882121.59873: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882121.59916: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882121.59920: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882121.59969: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882121.61835: stdout chunk 
(state=3): >>>ansible-tmp-1726882121.5941398-9387-156320432099887=/root/.ansible/tmp/ansible-tmp-1726882121.5941398-9387-156320432099887 <<< 7557 1726882121.61941: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882121.61963: stderr chunk (state=3): >>><<< 7557 1726882121.61966: stdout chunk (state=3): >>><<< 7557 1726882121.61979: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882121.5941398-9387-156320432099887=/root/.ansible/tmp/ansible-tmp-1726882121.5941398-9387-156320432099887 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7557 1726882121.62006: variable 'ansible_module_compression' from source: unknown 7557 1726882121.62047: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-7557ap94rh2e/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 7557 1726882121.62075: variable 'ansible_facts' from source: unknown 7557 1726882121.62137: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882121.5941398-9387-156320432099887/AnsiballZ_command.py 7557 1726882121.62227: Sending initial data 7557 1726882121.62231: Sent initial data (154 bytes) 7557 1726882121.62669: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7557 1726882121.62672: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found <<< 7557 1726882121.62674: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 7557 1726882121.62677: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7557 1726882121.62680: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882121.62731: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/537759ca41' <<< 7557 1726882121.62734: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882121.62783: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882121.64288: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 7557 1726882121.64292: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 7557 1726882121.64328: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 7557 1726882121.64373: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-7557ap94rh2e/tmpsrn31v1b /root/.ansible/tmp/ansible-tmp-1726882121.5941398-9387-156320432099887/AnsiballZ_command.py <<< 7557 1726882121.64379: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882121.5941398-9387-156320432099887/AnsiballZ_command.py" <<< 7557 1726882121.64431: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-7557ap94rh2e/tmpsrn31v1b" to remote "/root/.ansible/tmp/ansible-tmp-1726882121.5941398-9387-156320432099887/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882121.5941398-9387-156320432099887/AnsiballZ_command.py" <<< 7557 1726882121.65022: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882121.65060: stderr chunk (state=3): >>><<< 7557 1726882121.65064: stdout chunk (state=3): >>><<< 7557 1726882121.65092: done transferring module to remote 7557 1726882121.65106: _low_level_execute_command(): starting 7557 1726882121.65109: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882121.5941398-9387-156320432099887/ /root/.ansible/tmp/ansible-tmp-1726882121.5941398-9387-156320432099887/AnsiballZ_command.py && sleep 0' 7557 1726882121.65608: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882121.65611: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found <<< 7557 1726882121.65614: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 7557 1726882121.65632: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882121.65666: stderr 
chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882121.65670: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882121.65683: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882121.65736: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882121.67437: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882121.67457: stderr chunk (state=3): >>><<< 7557 1726882121.67460: stdout chunk (state=3): >>><<< 7557 1726882121.67472: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7557 1726882121.67477: _low_level_execute_command(): starting 7557 1726882121.67480: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882121.5941398-9387-156320432099887/AnsiballZ_command.py && sleep 0' 7557 1726882121.67882: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882121.67885: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882121.67888: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address <<< 7557 1726882121.67890: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882121.67892: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
<<< 7557 1726882121.67944: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882121.67949: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882121.68000: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882121.83147: stdout chunk (state=3): >>> {"changed": true, "stdout": "eth0\nlo\npeerveth0\nveth0", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 21:28:41.826465", "end": "2024-09-20 21:28:41.829631", "delta": "0:00:00.003166", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 7557 1726882121.84702: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. <<< 7557 1726882121.84706: stdout chunk (state=3): >>><<< 7557 1726882121.84709: stderr chunk (state=3): >>><<< 7557 1726882121.84712: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "eth0\nlo\npeerveth0\nveth0", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 21:28:41.826465", "end": "2024-09-20 21:28:41.829631", "delta": "0:00:00.003166", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. 
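[editor's note] The module output above corresponds to the "Gather current interface info" task (get_current_interfaces.yml:3): a command module run with _raw_params "ls -1" and chdir "/sys/class/net", returning eth0, lo, peerveth0 and veth0. The log goes on to report the task as ok with "changed": false (an "Evaluated conditional (False): False" entry, i.e. a changed_when evaluation) and then queues a "Set current_interfaces" task. A sketch consistent with those module_args; the register name, changed_when, and the set_fact body are assumptions:

    - name: Gather current interface info
      command:
        cmd: ls -1
        chdir: /sys/class/net
      register: current_interface_info   # register name is an assumption
      changed_when: false                # assumed, based on the final "changed": false in the log

    - name: Set current_interfaces
      set_fact:
        current_interfaces: "{{ current_interface_info.stdout_lines }}"   # assumed from the next task's name

On this run such a fact would come out as ["eth0", "lo", "peerveth0", "veth0"], matching the STDOUT reported below.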
7557 1726882121.84714: done with _execute_module (ansible.legacy.command, {'chdir': '/sys/class/net', '_raw_params': 'ls -1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882121.5941398-9387-156320432099887/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7557 1726882121.84717: _low_level_execute_command(): starting 7557 1726882121.84720: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882121.5941398-9387-156320432099887/ > /dev/null 2>&1 && sleep 0' 7557 1726882121.85349: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7557 1726882121.85358: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7557 1726882121.85370: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882121.85384: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7557 1726882121.85500: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 <<< 7557 1726882121.85508: stderr chunk (state=3): >>>debug2: match not found <<< 7557 1726882121.85510: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882121.85513: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7557 1726882121.85515: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.10.229 is address <<< 7557 1726882121.85521: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7557 1726882121.85524: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882121.85545: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882121.85566: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882121.85584: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882121.85663: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882121.87609: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882121.87613: stdout chunk (state=3): >>><<< 7557 1726882121.87621: stderr chunk (state=3): >>><<< 7557 1726882121.87638: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7557 1726882121.87645: handler run complete 7557 1726882121.88014: Evaluated conditional (False): False 7557 1726882121.88017: attempt loop complete, returning result 7557 1726882121.88021: _execute() done 7557 1726882121.88023: dumping result to json 7557 1726882121.88025: done dumping result, returning 7557 1726882121.88027: done running TaskExecutor() for managed_node3/TASK: Gather current interface info [12673a56-9f93-ed48-b3a5-000000001d53] 7557 1726882121.88029: sending task result for task 12673a56-9f93-ed48-b3a5-000000001d53 ok: [managed_node3] => { "changed": false, "cmd": [ "ls", "-1" ], "delta": "0:00:00.003166", "end": "2024-09-20 21:28:41.829631", "rc": 0, "start": "2024-09-20 21:28:41.826465" } STDOUT: eth0 lo peerveth0 veth0 7557 1726882121.88284: no more pending results, returning what we have 7557 1726882121.88289: results queue empty 7557 1726882121.88291: checking for any_errors_fatal 7557 1726882121.88400: done checking for any_errors_fatal 7557 1726882121.88401: checking for max_fail_percentage 7557 1726882121.88403: done checking for max_fail_percentage 7557 1726882121.88409: checking to see if all hosts have failed and the running result is not ok 7557 1726882121.88410: done checking to see if all hosts have failed 7557 1726882121.88410: getting the remaining hosts for this loop 7557 1726882121.88412: done getting the remaining hosts for this loop 7557 1726882121.88416: getting the next task for host managed_node3 7557 1726882121.88424: done getting next task for host managed_node3 7557 1726882121.88427: ^ task is: TASK: Set current_interfaces 7557 1726882121.88432: ^ state is: HOST STATE: block=2, task=40, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7557 1726882121.88437: getting variables 7557 1726882121.88438: in VariableManager get_vars() 7557 1726882121.88490: Calling all_inventory to load vars for managed_node3 7557 1726882121.88599: Calling groups_inventory to load vars for managed_node3 7557 1726882121.88616: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882121.88628: Calling all_plugins_play to load vars for managed_node3 7557 1726882121.88632: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882121.88637: Calling groups_plugins_play to load vars for managed_node3 7557 1726882121.89322: done sending task result for task 12673a56-9f93-ed48-b3a5-000000001d53 7557 1726882121.89326: WORKER PROCESS EXITING 7557 1726882121.91071: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882121.92670: done with get_vars() 7557 1726882121.92702: done getting variables 7557 1726882121.92763: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set current_interfaces] ************************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:9 Friday 20 September 2024 21:28:41 -0400 (0:00:00.372) 0:00:47.780 ****** 7557 1726882121.92801: entering _queue_task() for managed_node3/set_fact 7557 1726882121.93162: worker is 1 (out of 1 available) 7557 1726882121.93175: exiting _queue_task() for managed_node3/set_fact 7557 1726882121.93188: done queuing things up, now waiting for results queue to drain 7557 1726882121.93190: waiting for pending results... 
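For readability, here is a minimal sketch of the two tasks whose results are recorded around this point: the interface listing (an ansible.legacy.command run of ls -1 with chdir /sys/class/net, per the module_args logged above) and the Set current_interfaces fact that is queued next. The actual contents of get_current_interfaces.yml are not shown in this log, so the task layout, the _current_interfaces variable name, and the stdout_lines expression are assumptions inferred from the logged module arguments, variable names, and the resulting fact value.

    # Sketch only; reconstructed from the logged module_args and variable
    # names, not copied from get_current_interfaces.yml.
    - name: Gather current interface info
      ansible.builtin.command:
        cmd: ls -1
        chdir: /sys/class/net
      register: _current_interfaces

    - name: Set current_interfaces
      ansible.builtin.set_fact:
        # stdout "eth0\nlo\npeerveth0\nveth0" becomes a list of interface names
        current_interfaces: "{{ _current_interfaces.stdout_lines }}"

With the host used here this yields current_interfaces = ['eth0', 'lo', 'peerveth0', 'veth0'], which is exactly the fact value reported by the set_fact result below.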
7557 1726882121.93485: running TaskExecutor() for managed_node3/TASK: Set current_interfaces 7557 1726882121.93638: in run() - task 12673a56-9f93-ed48-b3a5-000000001d54 7557 1726882121.93658: variable 'ansible_search_path' from source: unknown 7557 1726882121.93665: variable 'ansible_search_path' from source: unknown 7557 1726882121.93709: calling self._execute() 7557 1726882121.93822: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882121.93837: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882121.93850: variable 'omit' from source: magic vars 7557 1726882121.94239: variable 'ansible_distribution_major_version' from source: facts 7557 1726882121.94256: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882121.94270: variable 'omit' from source: magic vars 7557 1726882121.94329: variable 'omit' from source: magic vars 7557 1726882121.94444: variable '_current_interfaces' from source: set_fact 7557 1726882121.94521: variable 'omit' from source: magic vars 7557 1726882121.94598: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7557 1726882121.94611: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7557 1726882121.94634: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7557 1726882121.94656: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7557 1726882121.94673: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7557 1726882121.94810: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7557 1726882121.94813: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882121.94816: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882121.94842: Set connection var ansible_module_compression to ZIP_DEFLATED 7557 1726882121.94855: Set connection var ansible_shell_executable to /bin/sh 7557 1726882121.94862: Set connection var ansible_shell_type to sh 7557 1726882121.94871: Set connection var ansible_pipelining to False 7557 1726882121.94878: Set connection var ansible_connection to ssh 7557 1726882121.94887: Set connection var ansible_timeout to 10 7557 1726882121.94921: variable 'ansible_shell_executable' from source: unknown 7557 1726882121.94931: variable 'ansible_connection' from source: unknown 7557 1726882121.94939: variable 'ansible_module_compression' from source: unknown 7557 1726882121.94946: variable 'ansible_shell_type' from source: unknown 7557 1726882121.94953: variable 'ansible_shell_executable' from source: unknown 7557 1726882121.94966: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882121.94981: variable 'ansible_pipelining' from source: unknown 7557 1726882121.94995: variable 'ansible_timeout' from source: unknown 7557 1726882121.95008: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882121.95170: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 7557 1726882121.95184: variable 
'omit' from source: magic vars 7557 1726882121.95196: starting attempt loop 7557 1726882121.95204: running the handler 7557 1726882121.95244: handler run complete 7557 1726882121.95246: attempt loop complete, returning result 7557 1726882121.95249: _execute() done 7557 1726882121.95250: dumping result to json 7557 1726882121.95252: done dumping result, returning 7557 1726882121.95254: done running TaskExecutor() for managed_node3/TASK: Set current_interfaces [12673a56-9f93-ed48-b3a5-000000001d54] 7557 1726882121.95262: sending task result for task 12673a56-9f93-ed48-b3a5-000000001d54 7557 1726882121.95500: done sending task result for task 12673a56-9f93-ed48-b3a5-000000001d54 7557 1726882121.95505: WORKER PROCESS EXITING ok: [managed_node3] => { "ansible_facts": { "current_interfaces": [ "eth0", "lo", "peerveth0", "veth0" ] }, "changed": false } 7557 1726882121.95571: no more pending results, returning what we have 7557 1726882121.95575: results queue empty 7557 1726882121.95577: checking for any_errors_fatal 7557 1726882121.95588: done checking for any_errors_fatal 7557 1726882121.95589: checking for max_fail_percentage 7557 1726882121.95590: done checking for max_fail_percentage 7557 1726882121.95592: checking to see if all hosts have failed and the running result is not ok 7557 1726882121.95596: done checking to see if all hosts have failed 7557 1726882121.95597: getting the remaining hosts for this loop 7557 1726882121.95599: done getting the remaining hosts for this loop 7557 1726882121.95603: getting the next task for host managed_node3 7557 1726882121.95613: done getting next task for host managed_node3 7557 1726882121.95615: ^ task is: TASK: Show current_interfaces 7557 1726882121.95622: ^ state is: HOST STATE: block=2, task=40, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7557 1726882121.95627: getting variables 7557 1726882121.95629: in VariableManager get_vars() 7557 1726882121.95687: Calling all_inventory to load vars for managed_node3 7557 1726882121.95690: Calling groups_inventory to load vars for managed_node3 7557 1726882121.95926: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882121.95938: Calling all_plugins_play to load vars for managed_node3 7557 1726882121.95941: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882121.95944: Calling groups_plugins_play to load vars for managed_node3 7557 1726882121.97266: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882121.99518: done with get_vars() 7557 1726882121.99540: done getting variables 7557 1726882121.99710: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Show current_interfaces] ************************************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:5 Friday 20 September 2024 21:28:41 -0400 (0:00:00.069) 0:00:47.850 ****** 7557 1726882121.99752: entering _queue_task() for managed_node3/debug 7557 1726882122.00129: worker is 1 (out of 1 available) 7557 1726882122.00140: exiting _queue_task() for managed_node3/debug 7557 1726882122.00153: done queuing things up, now waiting for results queue to drain 7557 1726882122.00154: waiting for pending results... 7557 1726882122.00452: running TaskExecutor() for managed_node3/TASK: Show current_interfaces 7557 1726882122.00620: in run() - task 12673a56-9f93-ed48-b3a5-000000001d1d 7557 1726882122.00624: variable 'ansible_search_path' from source: unknown 7557 1726882122.00628: variable 'ansible_search_path' from source: unknown 7557 1726882122.00658: calling self._execute() 7557 1726882122.01000: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882122.01004: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882122.01007: variable 'omit' from source: magic vars 7557 1726882122.01170: variable 'ansible_distribution_major_version' from source: facts 7557 1726882122.01188: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882122.01205: variable 'omit' from source: magic vars 7557 1726882122.01258: variable 'omit' from source: magic vars 7557 1726882122.01364: variable 'current_interfaces' from source: set_fact 7557 1726882122.01402: variable 'omit' from source: magic vars 7557 1726882122.01452: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7557 1726882122.01496: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7557 1726882122.01522: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7557 1726882122.01544: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7557 1726882122.01572: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7557 1726882122.01616: variable 'inventory_hostname' from 
source: host vars for 'managed_node3' 7557 1726882122.01631: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882122.01639: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882122.02012: Set connection var ansible_module_compression to ZIP_DEFLATED 7557 1726882122.02100: Set connection var ansible_shell_executable to /bin/sh 7557 1726882122.02103: Set connection var ansible_shell_type to sh 7557 1726882122.02106: Set connection var ansible_pipelining to False 7557 1726882122.02108: Set connection var ansible_connection to ssh 7557 1726882122.02110: Set connection var ansible_timeout to 10 7557 1726882122.02112: variable 'ansible_shell_executable' from source: unknown 7557 1726882122.02114: variable 'ansible_connection' from source: unknown 7557 1726882122.02117: variable 'ansible_module_compression' from source: unknown 7557 1726882122.02119: variable 'ansible_shell_type' from source: unknown 7557 1726882122.02121: variable 'ansible_shell_executable' from source: unknown 7557 1726882122.02123: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882122.02125: variable 'ansible_pipelining' from source: unknown 7557 1726882122.02127: variable 'ansible_timeout' from source: unknown 7557 1726882122.02129: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882122.02446: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 7557 1726882122.02462: variable 'omit' from source: magic vars 7557 1726882122.02751: starting attempt loop 7557 1726882122.02754: running the handler 7557 1726882122.02756: handler run complete 7557 1726882122.02758: attempt loop complete, returning result 7557 1726882122.02760: _execute() done 7557 1726882122.02762: dumping result to json 7557 1726882122.02764: done dumping result, returning 7557 1726882122.02766: done running TaskExecutor() for managed_node3/TASK: Show current_interfaces [12673a56-9f93-ed48-b3a5-000000001d1d] 7557 1726882122.02768: sending task result for task 12673a56-9f93-ed48-b3a5-000000001d1d ok: [managed_node3] => {} MSG: current_interfaces: ['eth0', 'lo', 'peerveth0', 'veth0'] 7557 1726882122.02907: no more pending results, returning what we have 7557 1726882122.02911: results queue empty 7557 1726882122.02912: checking for any_errors_fatal 7557 1726882122.02919: done checking for any_errors_fatal 7557 1726882122.02920: checking for max_fail_percentage 7557 1726882122.02922: done checking for max_fail_percentage 7557 1726882122.02923: checking to see if all hosts have failed and the running result is not ok 7557 1726882122.02924: done checking to see if all hosts have failed 7557 1726882122.02925: getting the remaining hosts for this loop 7557 1726882122.02927: done getting the remaining hosts for this loop 7557 1726882122.02931: getting the next task for host managed_node3 7557 1726882122.02939: done getting next task for host managed_node3 7557 1726882122.02943: ^ task is: TASK: Install iproute 7557 1726882122.02947: ^ state is: HOST STATE: block=2, task=40, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7557 1726882122.02952: getting variables 7557 1726882122.02954: in VariableManager get_vars() 7557 1726882122.03010: Calling all_inventory to load vars for managed_node3 7557 1726882122.03014: Calling groups_inventory to load vars for managed_node3 7557 1726882122.03016: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882122.03027: Calling all_plugins_play to load vars for managed_node3 7557 1726882122.03030: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882122.03032: Calling groups_plugins_play to load vars for managed_node3 7557 1726882122.04022: done sending task result for task 12673a56-9f93-ed48-b3a5-000000001d1d 7557 1726882122.04025: WORKER PROCESS EXITING 7557 1726882122.06102: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882122.08347: done with get_vars() 7557 1726882122.08379: done getting variables 7557 1726882122.08441: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Install iproute] ********************************************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:16 Friday 20 September 2024 21:28:42 -0400 (0:00:00.087) 0:00:47.937 ****** 7557 1726882122.08475: entering _queue_task() for managed_node3/package 7557 1726882122.09238: worker is 1 (out of 1 available) 7557 1726882122.09253: exiting _queue_task() for managed_node3/package 7557 1726882122.09269: done queuing things up, now waiting for results queue to drain 7557 1726882122.09270: waiting for pending results... 
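The Install iproute task that runs next resolves the generic package action to the dnf module; the logged invocation shows name: ["iproute"], state: "present", and the result later records attempts: 1 together with an evaluated conditional __install_status is success, which points at a register/until retry pattern. A hedged sketch under those assumptions (the exact YAML in manage_test_interface.yml, including any retry count, is not visible in this log):

    # Sketch only; module arguments taken from the logged dnf invocation,
    # retry handling inferred from the "__install_status is success" check.
    - name: Install iproute
      ansible.builtin.package:
        name: iproute
        state: present
      register: __install_status
      until: __install_status is success

Because iproute is already installed on managed_node3, dnf reports "Nothing to do" and the task comes back ok with changed: false.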
7557 1726882122.09712: running TaskExecutor() for managed_node3/TASK: Install iproute 7557 1726882122.10029: in run() - task 12673a56-9f93-ed48-b3a5-000000001a72 7557 1726882122.10047: variable 'ansible_search_path' from source: unknown 7557 1726882122.10051: variable 'ansible_search_path' from source: unknown 7557 1726882122.10089: calling self._execute() 7557 1726882122.10480: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882122.10484: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882122.10487: variable 'omit' from source: magic vars 7557 1726882122.11265: variable 'ansible_distribution_major_version' from source: facts 7557 1726882122.11277: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882122.11404: variable 'omit' from source: magic vars 7557 1726882122.11448: variable 'omit' from source: magic vars 7557 1726882122.11855: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 7557 1726882122.14645: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 7557 1726882122.14650: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 7557 1726882122.14688: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 7557 1726882122.14725: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 7557 1726882122.14752: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 7557 1726882122.14850: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7557 1726882122.15200: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7557 1726882122.15204: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7557 1726882122.15206: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7557 1726882122.15208: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7557 1726882122.15211: variable '__network_is_ostree' from source: set_fact 7557 1726882122.15213: variable 'omit' from source: magic vars 7557 1726882122.15215: variable 'omit' from source: magic vars 7557 1726882122.15217: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7557 1726882122.15220: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7557 1726882122.15223: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7557 1726882122.15308: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 
7557 1726882122.15311: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7557 1726882122.15314: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7557 1726882122.15316: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882122.15318: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882122.15394: Set connection var ansible_module_compression to ZIP_DEFLATED 7557 1726882122.15404: Set connection var ansible_shell_executable to /bin/sh 7557 1726882122.15413: Set connection var ansible_shell_type to sh 7557 1726882122.15422: Set connection var ansible_pipelining to False 7557 1726882122.15425: Set connection var ansible_connection to ssh 7557 1726882122.15431: Set connection var ansible_timeout to 10 7557 1726882122.15454: variable 'ansible_shell_executable' from source: unknown 7557 1726882122.15457: variable 'ansible_connection' from source: unknown 7557 1726882122.15460: variable 'ansible_module_compression' from source: unknown 7557 1726882122.15462: variable 'ansible_shell_type' from source: unknown 7557 1726882122.15464: variable 'ansible_shell_executable' from source: unknown 7557 1726882122.15466: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882122.15471: variable 'ansible_pipelining' from source: unknown 7557 1726882122.15474: variable 'ansible_timeout' from source: unknown 7557 1726882122.15478: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882122.15744: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 7557 1726882122.15748: variable 'omit' from source: magic vars 7557 1726882122.15750: starting attempt loop 7557 1726882122.15752: running the handler 7557 1726882122.15755: variable 'ansible_facts' from source: unknown 7557 1726882122.15757: variable 'ansible_facts' from source: unknown 7557 1726882122.15758: _low_level_execute_command(): starting 7557 1726882122.15760: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7557 1726882122.16373: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7557 1726882122.16600: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882122.16608: stderr chunk (state=3): 
>>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882122.16611: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882122.18266: stdout chunk (state=3): >>>/root <<< 7557 1726882122.18460: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882122.18470: stdout chunk (state=3): >>><<< 7557 1726882122.18479: stderr chunk (state=3): >>><<< 7557 1726882122.18720: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7557 1726882122.18732: _low_level_execute_command(): starting 7557 1726882122.18738: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882122.187195-9409-262738172116246 `" && echo ansible-tmp-1726882122.187195-9409-262738172116246="` echo /root/.ansible/tmp/ansible-tmp-1726882122.187195-9409-262738172116246 `" ) && sleep 0' 7557 1726882122.19858: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7557 1726882122.19971: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882122.20108: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882122.20140: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882122.20255: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882122.22124: stdout 
chunk (state=3): >>>ansible-tmp-1726882122.187195-9409-262738172116246=/root/.ansible/tmp/ansible-tmp-1726882122.187195-9409-262738172116246 <<< 7557 1726882122.22388: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882122.22404: stderr chunk (state=3): >>><<< 7557 1726882122.22431: stdout chunk (state=3): >>><<< 7557 1726882122.22478: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882122.187195-9409-262738172116246=/root/.ansible/tmp/ansible-tmp-1726882122.187195-9409-262738172116246 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7557 1726882122.22526: variable 'ansible_module_compression' from source: unknown 7557 1726882122.22616: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-7557ap94rh2e/ansiballz_cache/ansible.modules.dnf-ZIP_DEFLATED 7557 1726882122.22686: variable 'ansible_facts' from source: unknown 7557 1726882122.22849: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882122.187195-9409-262738172116246/AnsiballZ_dnf.py 7557 1726882122.23018: Sending initial data 7557 1726882122.23124: Sent initial data (149 bytes) 7557 1726882122.23821: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882122.23835: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7557 1726882122.23912: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882122.23934: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882122.23955: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7557 
1726882122.23972: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882122.24052: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882122.25614: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 7557 1726882122.25670: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 7557 1726882122.25731: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-7557ap94rh2e/tmp22j37_1n /root/.ansible/tmp/ansible-tmp-1726882122.187195-9409-262738172116246/AnsiballZ_dnf.py <<< 7557 1726882122.25734: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882122.187195-9409-262738172116246/AnsiballZ_dnf.py" <<< 7557 1726882122.25776: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-7557ap94rh2e/tmp22j37_1n" to remote "/root/.ansible/tmp/ansible-tmp-1726882122.187195-9409-262738172116246/AnsiballZ_dnf.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882122.187195-9409-262738172116246/AnsiballZ_dnf.py" <<< 7557 1726882122.26764: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882122.26792: stderr chunk (state=3): >>><<< 7557 1726882122.26910: stdout chunk (state=3): >>><<< 7557 1726882122.26913: done transferring module to remote 7557 1726882122.26915: _low_level_execute_command(): starting 7557 1726882122.26917: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882122.187195-9409-262738172116246/ /root/.ansible/tmp/ansible-tmp-1726882122.187195-9409-262738172116246/AnsiballZ_dnf.py && sleep 0' 7557 1726882122.27470: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7557 1726882122.27483: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7557 1726882122.27540: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882122.27555: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7557 1726882122.27613: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 
originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882122.27658: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882122.27676: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882122.27700: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882122.28016: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882122.29771: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882122.29774: stdout chunk (state=3): >>><<< 7557 1726882122.29777: stderr chunk (state=3): >>><<< 7557 1726882122.29779: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7557 1726882122.29781: _low_level_execute_command(): starting 7557 1726882122.29784: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882122.187195-9409-262738172116246/AnsiballZ_dnf.py && sleep 0' 7557 1726882122.30289: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7557 1726882122.30307: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7557 1726882122.30321: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882122.30337: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7557 1726882122.30353: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 <<< 7557 1726882122.30412: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882122.30466: stderr chunk 
(state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882122.30487: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882122.30506: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882122.30603: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882122.70801: stdout chunk (state=3): >>> {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["iproute"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} <<< 7557 1726882122.74919: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. <<< 7557 1726882122.74923: stdout chunk (state=3): >>><<< 7557 1726882122.74925: stderr chunk (state=3): >>><<< 7557 1726882122.74929: _low_level_execute_command() done: rc=0, stdout= {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["iproute"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. 
7557 1726882122.74931: done with _execute_module (ansible.legacy.dnf, {'name': 'iproute', 'state': 'present', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.dnf', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882122.187195-9409-262738172116246/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7557 1726882122.74938: _low_level_execute_command(): starting 7557 1726882122.74940: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882122.187195-9409-262738172116246/ > /dev/null 2>&1 && sleep 0' 7557 1726882122.75674: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7557 1726882122.75682: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7557 1726882122.75698: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882122.75786: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882122.75866: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882122.75870: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882122.75872: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882122.75917: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882122.77908: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882122.77911: stdout chunk (state=3): >>><<< 7557 1726882122.77914: stderr chunk (state=3): >>><<< 7557 1726882122.77917: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7557 1726882122.77919: handler run complete 7557 1726882122.78077: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 7557 1726882122.78547: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 7557 1726882122.78600: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 7557 1726882122.78638: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 7557 1726882122.79065: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 7557 1726882122.79152: variable '__install_status' from source: set_fact 7557 1726882122.79245: Evaluated conditional (__install_status is success): True 7557 1726882122.79248: attempt loop complete, returning result 7557 1726882122.79250: _execute() done 7557 1726882122.79252: dumping result to json 7557 1726882122.79254: done dumping result, returning 7557 1726882122.79256: done running TaskExecutor() for managed_node3/TASK: Install iproute [12673a56-9f93-ed48-b3a5-000000001a72] 7557 1726882122.79258: sending task result for task 12673a56-9f93-ed48-b3a5-000000001a72 ok: [managed_node3] => { "attempts": 1, "changed": false, "rc": 0, "results": [] } MSG: Nothing to do 7557 1726882122.79681: no more pending results, returning what we have 7557 1726882122.79685: results queue empty 7557 1726882122.79686: checking for any_errors_fatal 7557 1726882122.79692: done checking for any_errors_fatal 7557 1726882122.79697: checking for max_fail_percentage 7557 1726882122.79699: done checking for max_fail_percentage 7557 1726882122.79700: checking to see if all hosts have failed and the running result is not ok 7557 1726882122.79701: done checking to see if all hosts have failed 7557 1726882122.79702: getting the remaining hosts for this loop 7557 1726882122.79703: done getting the remaining hosts for this loop 7557 1726882122.79707: getting the next task for host managed_node3 7557 1726882122.79713: done getting next task for host managed_node3 7557 1726882122.79717: ^ task is: TASK: Create veth interface {{ interface }} 7557 1726882122.79720: ^ state is: HOST STATE: block=2, task=40, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7557 1726882122.79724: getting variables 7557 1726882122.79726: in VariableManager get_vars() 7557 1726882122.79776: Calling all_inventory to load vars for managed_node3 7557 1726882122.79779: Calling groups_inventory to load vars for managed_node3 7557 1726882122.79781: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882122.79791: Calling all_plugins_play to load vars for managed_node3 7557 1726882122.79802: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882122.79808: done sending task result for task 12673a56-9f93-ed48-b3a5-000000001a72 7557 1726882122.79811: WORKER PROCESS EXITING 7557 1726882122.79815: Calling groups_plugins_play to load vars for managed_node3 7557 1726882122.81698: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882122.84490: done with get_vars() 7557 1726882122.84526: done getting variables 7557 1726882122.84703: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 7557 1726882122.84940: variable 'interface' from source: play vars TASK [Create veth interface veth0] ********************************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:27 Friday 20 September 2024 21:28:42 -0400 (0:00:00.764) 0:00:48.702 ****** 7557 1726882122.84973: entering _queue_task() for managed_node3/command 7557 1726882122.85717: worker is 1 (out of 1 available) 7557 1726882122.85732: exiting _queue_task() for managed_node3/command 7557 1726882122.85745: done queuing things up, now waiting for results queue to drain 7557 1726882122.85746: waiting for pending results... 
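The Create veth interface veth0 task is a loop over three ip(8) commands guarded by a single conditional. The items lookup plugin loaded just below indicates a with_items style loop, and each skipped item records the same false_condition, so the task looks approximately like the following sketch (interface, type, and state are play vars or include params per the log; the use of {{ interface }} templating rather than literal interface names is an assumption):

    # Sketch only; loop items and the when clause are taken from the
    # per-item skip results logged for this task.
    - name: Create veth interface {{ interface }}
      ansible.builtin.command: "{{ item }}"
      with_items:
        - ip link add {{ interface }} type veth peer name peer{{ interface }}
        - ip link set peer{{ interface }} up
        - ip link set {{ interface }} up
      when: type == 'veth' and state == 'present' and interface not in current_interfaces

Since current_interfaces already contains veth0, the last clause evaluates to False and all three items are skipped, which matches the per-item results that follow.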
7557 1726882122.86125: running TaskExecutor() for managed_node3/TASK: Create veth interface veth0 7557 1726882122.86436: in run() - task 12673a56-9f93-ed48-b3a5-000000001a73 7557 1726882122.86450: variable 'ansible_search_path' from source: unknown 7557 1726882122.86459: variable 'ansible_search_path' from source: unknown 7557 1726882122.86945: variable 'interface' from source: play vars 7557 1726882122.87298: variable 'interface' from source: play vars 7557 1726882122.87302: variable 'interface' from source: play vars 7557 1726882122.87665: Loaded config def from plugin (lookup/items) 7557 1726882122.87669: Loading LookupModule 'items' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/items.py 7557 1726882122.87689: variable 'omit' from source: magic vars 7557 1726882122.88042: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882122.88053: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882122.88065: variable 'omit' from source: magic vars 7557 1726882122.88491: variable 'ansible_distribution_major_version' from source: facts 7557 1726882122.88503: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882122.88885: variable 'type' from source: play vars 7557 1726882122.88889: variable 'state' from source: include params 7557 1726882122.88975: variable 'interface' from source: play vars 7557 1726882122.88978: variable 'current_interfaces' from source: set_fact 7557 1726882122.88981: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): False 7557 1726882122.88983: when evaluation is False, skipping this task 7557 1726882122.88986: variable 'item' from source: unknown 7557 1726882122.89203: variable 'item' from source: unknown skipping: [managed_node3] => (item=ip link add veth0 type veth peer name peerveth0) => { "ansible_loop_var": "item", "changed": false, "false_condition": "type == 'veth' and state == 'present' and interface not in current_interfaces", "item": "ip link add veth0 type veth peer name peerveth0", "skip_reason": "Conditional result was False" } 7557 1726882122.89474: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882122.89477: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882122.89480: variable 'omit' from source: magic vars 7557 1726882122.89903: variable 'ansible_distribution_major_version' from source: facts 7557 1726882122.89906: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882122.90058: variable 'type' from source: play vars 7557 1726882122.90061: variable 'state' from source: include params 7557 1726882122.90066: variable 'interface' from source: play vars 7557 1726882122.90069: variable 'current_interfaces' from source: set_fact 7557 1726882122.90076: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): False 7557 1726882122.90079: when evaluation is False, skipping this task 7557 1726882122.90118: variable 'item' from source: unknown 7557 1726882122.90171: variable 'item' from source: unknown skipping: [managed_node3] => (item=ip link set peerveth0 up) => { "ansible_loop_var": "item", "changed": false, "false_condition": "type == 'veth' and state == 'present' and interface not in current_interfaces", "item": "ip link set peerveth0 up", "skip_reason": "Conditional result was False" } 7557 1726882122.90449: variable 'ansible_host' from source: host vars for 
'managed_node3' 7557 1726882122.90478: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882122.90481: variable 'omit' from source: magic vars 7557 1726882122.90817: variable 'ansible_distribution_major_version' from source: facts 7557 1726882122.90881: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882122.91208: variable 'type' from source: play vars 7557 1726882122.91212: variable 'state' from source: include params 7557 1726882122.91217: variable 'interface' from source: play vars 7557 1726882122.91220: variable 'current_interfaces' from source: set_fact 7557 1726882122.91230: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): False 7557 1726882122.91236: when evaluation is False, skipping this task 7557 1726882122.91258: variable 'item' from source: unknown 7557 1726882122.91321: variable 'item' from source: unknown skipping: [managed_node3] => (item=ip link set veth0 up) => { "ansible_loop_var": "item", "changed": false, "false_condition": "type == 'veth' and state == 'present' and interface not in current_interfaces", "item": "ip link set veth0 up", "skip_reason": "Conditional result was False" } 7557 1726882122.91407: dumping result to json 7557 1726882122.91410: done dumping result, returning 7557 1726882122.91413: done running TaskExecutor() for managed_node3/TASK: Create veth interface veth0 [12673a56-9f93-ed48-b3a5-000000001a73] 7557 1726882122.91415: sending task result for task 12673a56-9f93-ed48-b3a5-000000001a73 skipping: [managed_node3] => { "changed": false } MSG: All items skipped 7557 1726882122.91578: no more pending results, returning what we have 7557 1726882122.91582: results queue empty 7557 1726882122.91583: checking for any_errors_fatal 7557 1726882122.91595: done checking for any_errors_fatal 7557 1726882122.91596: checking for max_fail_percentage 7557 1726882122.91598: done checking for max_fail_percentage 7557 1726882122.91599: checking to see if all hosts have failed and the running result is not ok 7557 1726882122.91600: done checking to see if all hosts have failed 7557 1726882122.91600: getting the remaining hosts for this loop 7557 1726882122.91602: done getting the remaining hosts for this loop 7557 1726882122.91605: getting the next task for host managed_node3 7557 1726882122.91612: done getting next task for host managed_node3 7557 1726882122.91614: ^ task is: TASK: Set up veth as managed by NetworkManager 7557 1726882122.91618: ^ state is: HOST STATE: block=2, task=40, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7557 1726882122.91622: getting variables 7557 1726882122.91623: in VariableManager get_vars() 7557 1726882122.91675: Calling all_inventory to load vars for managed_node3 7557 1726882122.91678: Calling groups_inventory to load vars for managed_node3 7557 1726882122.91680: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882122.91691: Calling all_plugins_play to load vars for managed_node3 7557 1726882122.91809: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882122.91814: Calling groups_plugins_play to load vars for managed_node3 7557 1726882122.92610: done sending task result for task 12673a56-9f93-ed48-b3a5-000000001a73 7557 1726882122.92613: WORKER PROCESS EXITING 7557 1726882122.94584: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882122.97175: done with get_vars() 7557 1726882122.97210: done getting variables 7557 1726882122.97269: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set up veth as managed by NetworkManager] ******************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:35 Friday 20 September 2024 21:28:42 -0400 (0:00:00.124) 0:00:48.827 ****** 7557 1726882122.97422: entering _queue_task() for managed_node3/command 7557 1726882122.98433: worker is 1 (out of 1 available) 7557 1726882122.98445: exiting _queue_task() for managed_node3/command 7557 1726882122.98457: done queuing things up, now waiting for results queue to drain 7557 1726882122.98459: waiting for pending results... 
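The next task, "Set up veth as managed by NetworkManager" (manage_test_interface.yml:35), is gated on type == 'veth' and state == 'present' and therefore also skips in this run; its command never appears in the log. Purely as an illustration of what such a task typically does (the nmcli call below is an assumption, not recovered from the file):

    - name: Set up veth as managed by NetworkManager
      command: nmcli device set {{ interface }} managed yes   # illustrative only; the real command is not shown in this log
      when: type == 'veth' and state == 'present'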
7557 1726882122.98801: running TaskExecutor() for managed_node3/TASK: Set up veth as managed by NetworkManager 7557 1726882122.99113: in run() - task 12673a56-9f93-ed48-b3a5-000000001a74 7557 1726882122.99127: variable 'ansible_search_path' from source: unknown 7557 1726882122.99231: variable 'ansible_search_path' from source: unknown 7557 1726882122.99235: calling self._execute() 7557 1726882122.99366: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882122.99374: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882122.99385: variable 'omit' from source: magic vars 7557 1726882123.00336: variable 'ansible_distribution_major_version' from source: facts 7557 1726882123.00340: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882123.00724: variable 'type' from source: play vars 7557 1726882123.00730: variable 'state' from source: include params 7557 1726882123.00735: Evaluated conditional (type == 'veth' and state == 'present'): False 7557 1726882123.00739: when evaluation is False, skipping this task 7557 1726882123.00741: _execute() done 7557 1726882123.00744: dumping result to json 7557 1726882123.00746: done dumping result, returning 7557 1726882123.00754: done running TaskExecutor() for managed_node3/TASK: Set up veth as managed by NetworkManager [12673a56-9f93-ed48-b3a5-000000001a74] 7557 1726882123.00759: sending task result for task 12673a56-9f93-ed48-b3a5-000000001a74 7557 1726882123.00856: done sending task result for task 12673a56-9f93-ed48-b3a5-000000001a74 7557 1726882123.00859: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "type == 'veth' and state == 'present'", "skip_reason": "Conditional result was False" } 7557 1726882123.00927: no more pending results, returning what we have 7557 1726882123.00931: results queue empty 7557 1726882123.00933: checking for any_errors_fatal 7557 1726882123.00947: done checking for any_errors_fatal 7557 1726882123.00948: checking for max_fail_percentage 7557 1726882123.00950: done checking for max_fail_percentage 7557 1726882123.00951: checking to see if all hosts have failed and the running result is not ok 7557 1726882123.00952: done checking to see if all hosts have failed 7557 1726882123.00953: getting the remaining hosts for this loop 7557 1726882123.00955: done getting the remaining hosts for this loop 7557 1726882123.00959: getting the next task for host managed_node3 7557 1726882123.00966: done getting next task for host managed_node3 7557 1726882123.00969: ^ task is: TASK: Delete veth interface {{ interface }} 7557 1726882123.00973: ^ state is: HOST STATE: block=2, task=40, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7557 1726882123.00979: getting variables 7557 1726882123.00981: in VariableManager get_vars() 7557 1726882123.01039: Calling all_inventory to load vars for managed_node3 7557 1726882123.01042: Calling groups_inventory to load vars for managed_node3 7557 1726882123.01044: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882123.01056: Calling all_plugins_play to load vars for managed_node3 7557 1726882123.01058: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882123.01061: Calling groups_plugins_play to load vars for managed_node3 7557 1726882123.04321: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882123.07516: done with get_vars() 7557 1726882123.07547: done getting variables 7557 1726882123.07673: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 7557 1726882123.07901: variable 'interface' from source: play vars TASK [Delete veth interface veth0] ********************************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:43 Friday 20 September 2024 21:28:43 -0400 (0:00:00.105) 0:00:48.933 ****** 7557 1726882123.08023: entering _queue_task() for managed_node3/command 7557 1726882123.08643: worker is 1 (out of 1 available) 7557 1726882123.08657: exiting _queue_task() for managed_node3/command 7557 1726882123.08798: done queuing things up, now waiting for results queue to drain 7557 1726882123.08800: waiting for pending results... 
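The "Delete veth interface veth0" task (manage_test_interface.yml:43) is the one that actually runs: the output below shows its conditional evaluating True and the command module executing ip link del veth0 type veth on managed_node3. A tasks-file sketch consistent with the reported argv and the evaluated conditional (exact YAML and templating are assumptions):

    - name: Delete veth interface {{ interface }}
      command: ip link del {{ interface }} type veth
      when: type == 'veth' and state == 'absent' and interface in current_interfaces

The surrounding output also traces the usual command-module round trip over the multiplexed SSH connection: "echo ~" to resolve the remote home, creation of a working directory under /root/.ansible/tmp, an sftp transfer of AnsiballZ_command.py, a chmod u+x, execution with /usr/bin/python3.12, and finally removal of the temporary directory once the JSON result has been read back.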
7557 1726882123.09357: running TaskExecutor() for managed_node3/TASK: Delete veth interface veth0 7557 1726882123.09376: in run() - task 12673a56-9f93-ed48-b3a5-000000001a75 7557 1726882123.09392: variable 'ansible_search_path' from source: unknown 7557 1726882123.09397: variable 'ansible_search_path' from source: unknown 7557 1726882123.09698: calling self._execute() 7557 1726882123.09901: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882123.09905: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882123.09908: variable 'omit' from source: magic vars 7557 1726882123.10606: variable 'ansible_distribution_major_version' from source: facts 7557 1726882123.10623: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882123.11030: variable 'type' from source: play vars 7557 1726882123.11041: variable 'state' from source: include params 7557 1726882123.11049: variable 'interface' from source: play vars 7557 1726882123.11056: variable 'current_interfaces' from source: set_fact 7557 1726882123.11069: Evaluated conditional (type == 'veth' and state == 'absent' and interface in current_interfaces): True 7557 1726882123.11500: variable 'omit' from source: magic vars 7557 1726882123.11503: variable 'omit' from source: magic vars 7557 1726882123.11506: variable 'interface' from source: play vars 7557 1726882123.11508: variable 'omit' from source: magic vars 7557 1726882123.11519: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7557 1726882123.11559: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7557 1726882123.11999: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7557 1726882123.12003: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7557 1726882123.12005: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7557 1726882123.12008: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7557 1726882123.12010: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882123.12012: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882123.12014: Set connection var ansible_module_compression to ZIP_DEFLATED 7557 1726882123.12016: Set connection var ansible_shell_executable to /bin/sh 7557 1726882123.12021: Set connection var ansible_shell_type to sh 7557 1726882123.12031: Set connection var ansible_pipelining to False 7557 1726882123.12038: Set connection var ansible_connection to ssh 7557 1726882123.12047: Set connection var ansible_timeout to 10 7557 1726882123.12499: variable 'ansible_shell_executable' from source: unknown 7557 1726882123.12502: variable 'ansible_connection' from source: unknown 7557 1726882123.12505: variable 'ansible_module_compression' from source: unknown 7557 1726882123.12507: variable 'ansible_shell_type' from source: unknown 7557 1726882123.12509: variable 'ansible_shell_executable' from source: unknown 7557 1726882123.12510: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882123.12512: variable 'ansible_pipelining' from source: unknown 7557 1726882123.12514: variable 'ansible_timeout' from source: unknown 7557 1726882123.12515: variable 'ansible_ssh_extra_args' from source: 
host vars for 'managed_node3' 7557 1726882123.12518: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 7557 1726882123.12520: variable 'omit' from source: magic vars 7557 1726882123.12523: starting attempt loop 7557 1726882123.12526: running the handler 7557 1726882123.12529: _low_level_execute_command(): starting 7557 1726882123.12706: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7557 1726882123.14030: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882123.14116: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882123.14223: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882123.14236: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882123.14322: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882123.16002: stdout chunk (state=3): >>>/root <<< 7557 1726882123.16082: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882123.16120: stderr chunk (state=3): >>><<< 7557 1726882123.16130: stdout chunk (state=3): >>><<< 7557 1726882123.16421: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: 
Received exit status from master 0 7557 1726882123.16425: _low_level_execute_command(): starting 7557 1726882123.16428: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882123.1631908-9445-47678353388514 `" && echo ansible-tmp-1726882123.1631908-9445-47678353388514="` echo /root/.ansible/tmp/ansible-tmp-1726882123.1631908-9445-47678353388514 `" ) && sleep 0' 7557 1726882123.17580: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7557 1726882123.17622: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882123.17665: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882123.17885: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882123.17978: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882123.19874: stdout chunk (state=3): >>>ansible-tmp-1726882123.1631908-9445-47678353388514=/root/.ansible/tmp/ansible-tmp-1726882123.1631908-9445-47678353388514 <<< 7557 1726882123.19975: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882123.20018: stderr chunk (state=3): >>><<< 7557 1726882123.20030: stdout chunk (state=3): >>><<< 7557 1726882123.20114: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882123.1631908-9445-47678353388514=/root/.ansible/tmp/ansible-tmp-1726882123.1631908-9445-47678353388514 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: 
master session id: 2 debug2: Received exit status from master 0 7557 1726882123.20151: variable 'ansible_module_compression' from source: unknown 7557 1726882123.20500: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-7557ap94rh2e/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 7557 1726882123.20504: variable 'ansible_facts' from source: unknown 7557 1726882123.20547: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882123.1631908-9445-47678353388514/AnsiballZ_command.py 7557 1726882123.20816: Sending initial data 7557 1726882123.20827: Sent initial data (153 bytes) 7557 1726882123.21925: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7557 1726882123.21939: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882123.21953: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 <<< 7557 1726882123.21962: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882123.22167: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882123.22179: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882123.22239: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882123.23756: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 7557 1726882123.23789: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 7557 1726882123.23907: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 7557 1726882123.23983: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-7557ap94rh2e/tmpo_z3a77n /root/.ansible/tmp/ansible-tmp-1726882123.1631908-9445-47678353388514/AnsiballZ_command.py <<< 7557 1726882123.23986: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882123.1631908-9445-47678353388514/AnsiballZ_command.py" <<< 7557 1726882123.24021: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-7557ap94rh2e/tmpo_z3a77n" to remote "/root/.ansible/tmp/ansible-tmp-1726882123.1631908-9445-47678353388514/AnsiballZ_command.py" <<< 7557 1726882123.24112: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882123.1631908-9445-47678353388514/AnsiballZ_command.py" <<< 7557 1726882123.25280: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882123.25401: stderr chunk (state=3): >>><<< 7557 1726882123.25406: stdout chunk (state=3): >>><<< 7557 1726882123.25409: done transferring module to remote 7557 1726882123.25411: _low_level_execute_command(): starting 7557 1726882123.25414: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882123.1631908-9445-47678353388514/ /root/.ansible/tmp/ansible-tmp-1726882123.1631908-9445-47678353388514/AnsiballZ_command.py && sleep 0' 7557 1726882123.26762: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882123.26803: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882123.26870: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882123.26929: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882123.27078: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882123.28874: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882123.28878: stdout chunk (state=3): >>><<< 7557 1726882123.28925: stderr chunk (state=3): >>><<< 7557 1726882123.29011: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7557 1726882123.29014: _low_level_execute_command(): starting 7557 1726882123.29017: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882123.1631908-9445-47678353388514/AnsiballZ_command.py && sleep 0' 7557 1726882123.29672: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 7557 1726882123.29708: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882123.29792: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882123.29798: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882123.29800: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882123.29881: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882123.46098: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "del", "veth0", "type", "veth"], "start": "2024-09-20 21:28:43.445909", "end": "2024-09-20 21:28:43.456183", "delta": "0:00:00.010274", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link del veth0 type veth", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 7557 1726882123.48127: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. 
<<< 7557 1726882123.48177: stderr chunk (state=3): >>><<< 7557 1726882123.48179: stdout chunk (state=3): >>><<< 7557 1726882123.48202: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "del", "veth0", "type", "veth"], "start": "2024-09-20 21:28:43.445909", "end": "2024-09-20 21:28:43.456183", "delta": "0:00:00.010274", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link del veth0 type veth", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. 
7557 1726882123.48328: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link del veth0 type veth', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882123.1631908-9445-47678353388514/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7557 1726882123.48331: _low_level_execute_command(): starting 7557 1726882123.48335: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882123.1631908-9445-47678353388514/ > /dev/null 2>&1 && sleep 0' 7557 1726882123.49011: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882123.49038: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882123.49063: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882123.49075: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882123.49161: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882123.51104: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882123.51108: stdout chunk (state=3): >>><<< 7557 1726882123.51110: stderr chunk (state=3): >>><<< 7557 1726882123.51113: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: 
match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7557 1726882123.51115: handler run complete 7557 1726882123.51118: Evaluated conditional (False): False 7557 1726882123.51120: attempt loop complete, returning result 7557 1726882123.51122: _execute() done 7557 1726882123.51123: dumping result to json 7557 1726882123.51125: done dumping result, returning 7557 1726882123.51127: done running TaskExecutor() for managed_node3/TASK: Delete veth interface veth0 [12673a56-9f93-ed48-b3a5-000000001a75] 7557 1726882123.51129: sending task result for task 12673a56-9f93-ed48-b3a5-000000001a75 7557 1726882123.51198: done sending task result for task 12673a56-9f93-ed48-b3a5-000000001a75 7557 1726882123.51201: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "cmd": [ "ip", "link", "del", "veth0", "type", "veth" ], "delta": "0:00:00.010274", "end": "2024-09-20 21:28:43.456183", "rc": 0, "start": "2024-09-20 21:28:43.445909" } 7557 1726882123.51454: no more pending results, returning what we have 7557 1726882123.51458: results queue empty 7557 1726882123.51459: checking for any_errors_fatal 7557 1726882123.51466: done checking for any_errors_fatal 7557 1726882123.51467: checking for max_fail_percentage 7557 1726882123.51468: done checking for max_fail_percentage 7557 1726882123.51469: checking to see if all hosts have failed and the running result is not ok 7557 1726882123.51470: done checking to see if all hosts have failed 7557 1726882123.51471: getting the remaining hosts for this loop 7557 1726882123.51472: done getting the remaining hosts for this loop 7557 1726882123.51475: getting the next task for host managed_node3 7557 1726882123.51482: done getting next task for host managed_node3 7557 1726882123.51484: ^ task is: TASK: Create dummy interface {{ interface }} 7557 1726882123.51487: ^ state is: HOST STATE: block=2, task=40, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7557 1726882123.51491: getting variables 7557 1726882123.51500: in VariableManager get_vars() 7557 1726882123.51550: Calling all_inventory to load vars for managed_node3 7557 1726882123.51553: Calling groups_inventory to load vars for managed_node3 7557 1726882123.51556: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882123.51566: Calling all_plugins_play to load vars for managed_node3 7557 1726882123.51568: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882123.51571: Calling groups_plugins_play to load vars for managed_node3 7557 1726882123.52948: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882123.54538: done with get_vars() 7557 1726882123.54564: done getting variables 7557 1726882123.54627: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 7557 1726882123.54738: variable 'interface' from source: play vars TASK [Create dummy interface veth0] ******************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:49 Friday 20 September 2024 21:28:43 -0400 (0:00:00.467) 0:00:49.400 ****** 7557 1726882123.54771: entering _queue_task() for managed_node3/command 7557 1726882123.55115: worker is 1 (out of 1 available) 7557 1726882123.55128: exiting _queue_task() for managed_node3/command 7557 1726882123.55143: done queuing things up, now waiting for results queue to drain 7557 1726882123.55144: waiting for pending results... 
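The rest of manage_test_interface.yml follows the same pattern for the other interface types: Create/Delete dummy interface (lines 49 and 54) and Create/Delete tap interface (lines 60 and 65) each guard a command with a type/state/current_interfaces conditional. Because type is 'veth' in this run, every one of them reports "Conditional result was False" and skips, as the following output shows. Schematically (the command body below is an assumption; only the conditions appear in this log, and the tap tasks differ only in using type == 'tap'):

    - name: Delete dummy interface {{ interface }}
      command: ip link del {{ interface }} type dummy   # assumed; this task skips, so its argv never appears in the log
      when: type == 'dummy' and state == 'absent' and interface in current_interfaces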
7557 1726882123.55617: running TaskExecutor() for managed_node3/TASK: Create dummy interface veth0 7557 1726882123.55623: in run() - task 12673a56-9f93-ed48-b3a5-000000001a76 7557 1726882123.55627: variable 'ansible_search_path' from source: unknown 7557 1726882123.55630: variable 'ansible_search_path' from source: unknown 7557 1726882123.55661: calling self._execute() 7557 1726882123.55780: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882123.55817: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882123.55825: variable 'omit' from source: magic vars 7557 1726882123.56223: variable 'ansible_distribution_major_version' from source: facts 7557 1726882123.56261: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882123.56463: variable 'type' from source: play vars 7557 1726882123.56478: variable 'state' from source: include params 7557 1726882123.56500: variable 'interface' from source: play vars 7557 1726882123.56503: variable 'current_interfaces' from source: set_fact 7557 1726882123.56590: Evaluated conditional (type == 'dummy' and state == 'present' and interface not in current_interfaces): False 7557 1726882123.56598: when evaluation is False, skipping this task 7557 1726882123.56601: _execute() done 7557 1726882123.56603: dumping result to json 7557 1726882123.56606: done dumping result, returning 7557 1726882123.56608: done running TaskExecutor() for managed_node3/TASK: Create dummy interface veth0 [12673a56-9f93-ed48-b3a5-000000001a76] 7557 1726882123.56611: sending task result for task 12673a56-9f93-ed48-b3a5-000000001a76 7557 1726882123.56677: done sending task result for task 12673a56-9f93-ed48-b3a5-000000001a76 7557 1726882123.56680: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "type == 'dummy' and state == 'present' and interface not in current_interfaces", "skip_reason": "Conditional result was False" } 7557 1726882123.56745: no more pending results, returning what we have 7557 1726882123.56750: results queue empty 7557 1726882123.56752: checking for any_errors_fatal 7557 1726882123.56763: done checking for any_errors_fatal 7557 1726882123.56764: checking for max_fail_percentage 7557 1726882123.56766: done checking for max_fail_percentage 7557 1726882123.56767: checking to see if all hosts have failed and the running result is not ok 7557 1726882123.56768: done checking to see if all hosts have failed 7557 1726882123.56768: getting the remaining hosts for this loop 7557 1726882123.56770: done getting the remaining hosts for this loop 7557 1726882123.56774: getting the next task for host managed_node3 7557 1726882123.56781: done getting next task for host managed_node3 7557 1726882123.56784: ^ task is: TASK: Delete dummy interface {{ interface }} 7557 1726882123.56789: ^ state is: HOST STATE: block=2, task=40, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7557 1726882123.56798: getting variables 7557 1726882123.56800: in VariableManager get_vars() 7557 1726882123.56861: Calling all_inventory to load vars for managed_node3 7557 1726882123.56865: Calling groups_inventory to load vars for managed_node3 7557 1726882123.56868: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882123.56881: Calling all_plugins_play to load vars for managed_node3 7557 1726882123.56885: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882123.56889: Calling groups_plugins_play to load vars for managed_node3 7557 1726882123.58640: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882123.60482: done with get_vars() 7557 1726882123.60782: done getting variables 7557 1726882123.60849: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 7557 1726882123.60959: variable 'interface' from source: play vars TASK [Delete dummy interface veth0] ******************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:54 Friday 20 September 2024 21:28:43 -0400 (0:00:00.062) 0:00:49.462 ****** 7557 1726882123.61139: entering _queue_task() for managed_node3/command 7557 1726882123.61558: worker is 1 (out of 1 available) 7557 1726882123.61572: exiting _queue_task() for managed_node3/command 7557 1726882123.61587: done queuing things up, now waiting for results queue to drain 7557 1726882123.61588: waiting for pending results... 
7557 1726882123.61911: running TaskExecutor() for managed_node3/TASK: Delete dummy interface veth0 7557 1726882123.62005: in run() - task 12673a56-9f93-ed48-b3a5-000000001a77 7557 1726882123.62031: variable 'ansible_search_path' from source: unknown 7557 1726882123.62067: variable 'ansible_search_path' from source: unknown 7557 1726882123.62095: calling self._execute() 7557 1726882123.62213: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882123.62226: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882123.62287: variable 'omit' from source: magic vars 7557 1726882123.62659: variable 'ansible_distribution_major_version' from source: facts 7557 1726882123.62681: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882123.62890: variable 'type' from source: play vars 7557 1726882123.62909: variable 'state' from source: include params 7557 1726882123.62936: variable 'interface' from source: play vars 7557 1726882123.62939: variable 'current_interfaces' from source: set_fact 7557 1726882123.62942: Evaluated conditional (type == 'dummy' and state == 'absent' and interface in current_interfaces): False 7557 1726882123.63099: when evaluation is False, skipping this task 7557 1726882123.63102: _execute() done 7557 1726882123.63105: dumping result to json 7557 1726882123.63108: done dumping result, returning 7557 1726882123.63111: done running TaskExecutor() for managed_node3/TASK: Delete dummy interface veth0 [12673a56-9f93-ed48-b3a5-000000001a77] 7557 1726882123.63113: sending task result for task 12673a56-9f93-ed48-b3a5-000000001a77 skipping: [managed_node3] => { "changed": false, "false_condition": "type == 'dummy' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was False" } 7557 1726882123.63235: no more pending results, returning what we have 7557 1726882123.63239: results queue empty 7557 1726882123.63240: checking for any_errors_fatal 7557 1726882123.63248: done checking for any_errors_fatal 7557 1726882123.63249: checking for max_fail_percentage 7557 1726882123.63251: done checking for max_fail_percentage 7557 1726882123.63252: checking to see if all hosts have failed and the running result is not ok 7557 1726882123.63253: done checking to see if all hosts have failed 7557 1726882123.63253: getting the remaining hosts for this loop 7557 1726882123.63255: done getting the remaining hosts for this loop 7557 1726882123.63258: getting the next task for host managed_node3 7557 1726882123.63264: done getting next task for host managed_node3 7557 1726882123.63267: ^ task is: TASK: Create tap interface {{ interface }} 7557 1726882123.63270: ^ state is: HOST STATE: block=2, task=40, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7557 1726882123.63275: getting variables 7557 1726882123.63276: in VariableManager get_vars() 7557 1726882123.63330: Calling all_inventory to load vars for managed_node3 7557 1726882123.63333: Calling groups_inventory to load vars for managed_node3 7557 1726882123.63336: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882123.63349: Calling all_plugins_play to load vars for managed_node3 7557 1726882123.63352: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882123.63355: Calling groups_plugins_play to load vars for managed_node3 7557 1726882123.64307: done sending task result for task 12673a56-9f93-ed48-b3a5-000000001a77 7557 1726882123.64311: WORKER PROCESS EXITING 7557 1726882123.64956: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882123.73233: done with get_vars() 7557 1726882123.73265: done getting variables 7557 1726882123.73524: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 7557 1726882123.73621: variable 'interface' from source: play vars TASK [Create tap interface veth0] ********************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:60 Friday 20 September 2024 21:28:43 -0400 (0:00:00.126) 0:00:49.589 ****** 7557 1726882123.73647: entering _queue_task() for managed_node3/command 7557 1726882123.74397: worker is 1 (out of 1 available) 7557 1726882123.74411: exiting _queue_task() for managed_node3/command 7557 1726882123.74426: done queuing things up, now waiting for results queue to drain 7557 1726882123.74427: waiting for pending results... 
7557 1726882123.74953: running TaskExecutor() for managed_node3/TASK: Create tap interface veth0 7557 1726882123.75173: in run() - task 12673a56-9f93-ed48-b3a5-000000001a78 7557 1726882123.75185: variable 'ansible_search_path' from source: unknown 7557 1726882123.75189: variable 'ansible_search_path' from source: unknown 7557 1726882123.75334: calling self._execute() 7557 1726882123.75446: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882123.75564: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882123.75578: variable 'omit' from source: magic vars 7557 1726882123.76423: variable 'ansible_distribution_major_version' from source: facts 7557 1726882123.76442: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882123.77016: variable 'type' from source: play vars 7557 1726882123.77020: variable 'state' from source: include params 7557 1726882123.77022: variable 'interface' from source: play vars 7557 1726882123.77025: variable 'current_interfaces' from source: set_fact 7557 1726882123.77028: Evaluated conditional (type == 'tap' and state == 'present' and interface not in current_interfaces): False 7557 1726882123.77299: when evaluation is False, skipping this task 7557 1726882123.77303: _execute() done 7557 1726882123.77306: dumping result to json 7557 1726882123.77308: done dumping result, returning 7557 1726882123.77310: done running TaskExecutor() for managed_node3/TASK: Create tap interface veth0 [12673a56-9f93-ed48-b3a5-000000001a78] 7557 1726882123.77312: sending task result for task 12673a56-9f93-ed48-b3a5-000000001a78 7557 1726882123.77388: done sending task result for task 12673a56-9f93-ed48-b3a5-000000001a78 7557 1726882123.77452: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "type == 'tap' and state == 'present' and interface not in current_interfaces", "skip_reason": "Conditional result was False" } 7557 1726882123.77506: no more pending results, returning what we have 7557 1726882123.77510: results queue empty 7557 1726882123.77511: checking for any_errors_fatal 7557 1726882123.77521: done checking for any_errors_fatal 7557 1726882123.77521: checking for max_fail_percentage 7557 1726882123.77523: done checking for max_fail_percentage 7557 1726882123.77524: checking to see if all hosts have failed and the running result is not ok 7557 1726882123.77525: done checking to see if all hosts have failed 7557 1726882123.77525: getting the remaining hosts for this loop 7557 1726882123.77527: done getting the remaining hosts for this loop 7557 1726882123.77530: getting the next task for host managed_node3 7557 1726882123.77536: done getting next task for host managed_node3 7557 1726882123.77539: ^ task is: TASK: Delete tap interface {{ interface }} 7557 1726882123.77542: ^ state is: HOST STATE: block=2, task=40, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7557 1726882123.77547: getting variables 7557 1726882123.77548: in VariableManager get_vars() 7557 1726882123.77613: Calling all_inventory to load vars for managed_node3 7557 1726882123.77616: Calling groups_inventory to load vars for managed_node3 7557 1726882123.77620: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882123.77633: Calling all_plugins_play to load vars for managed_node3 7557 1726882123.77637: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882123.77640: Calling groups_plugins_play to load vars for managed_node3 7557 1726882123.79509: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882123.81276: done with get_vars() 7557 1726882123.81311: done getting variables 7557 1726882123.81370: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 7557 1726882123.81487: variable 'interface' from source: play vars TASK [Delete tap interface veth0] ********************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:65 Friday 20 September 2024 21:28:43 -0400 (0:00:00.079) 0:00:49.668 ****** 7557 1726882123.81554: entering _queue_task() for managed_node3/command 7557 1726882123.82151: worker is 1 (out of 1 available) 7557 1726882123.82164: exiting _queue_task() for managed_node3/command 7557 1726882123.82177: done queuing things up, now waiting for results queue to drain 7557 1726882123.82178: waiting for pending results... 
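The variable sources reported above ('interface' and 'type' from play vars, 'state' from include params, 'current_interfaces' from set_fact) imply that manage_test_interface.yml is parameterised at include time. A minimal sketch of such a call, assuming include_tasks and an illustrative value for state (neither the calling form nor the current value of state is shown in this part of the log):

  - name: Manage the test interface
    include_tasks: tasks/manage_test_interface.yml
    vars:
      state: absent   # illustrative only; 'state' is reported as coming "from source: include params"
  # 'interface' (here veth0) and 'type' are play vars, and 'current_interfaces'
  # is produced earlier by a set_fact task.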
7557 1726882123.82746: running TaskExecutor() for managed_node3/TASK: Delete tap interface veth0 7557 1726882123.82796: in run() - task 12673a56-9f93-ed48-b3a5-000000001a79 7557 1726882123.82859: variable 'ansible_search_path' from source: unknown 7557 1726882123.82906: variable 'ansible_search_path' from source: unknown 7557 1726882123.83061: calling self._execute() 7557 1726882123.83241: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882123.83253: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882123.83268: variable 'omit' from source: magic vars 7557 1726882123.83790: variable 'ansible_distribution_major_version' from source: facts 7557 1726882123.83812: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882123.84022: variable 'type' from source: play vars 7557 1726882123.84034: variable 'state' from source: include params 7557 1726882123.84049: variable 'interface' from source: play vars 7557 1726882123.84058: variable 'current_interfaces' from source: set_fact 7557 1726882123.84071: Evaluated conditional (type == 'tap' and state == 'absent' and interface in current_interfaces): False 7557 1726882123.84078: when evaluation is False, skipping this task 7557 1726882123.84085: _execute() done 7557 1726882123.84091: dumping result to json 7557 1726882123.84103: done dumping result, returning 7557 1726882123.84114: done running TaskExecutor() for managed_node3/TASK: Delete tap interface veth0 [12673a56-9f93-ed48-b3a5-000000001a79] 7557 1726882123.84125: sending task result for task 12673a56-9f93-ed48-b3a5-000000001a79 7557 1726882123.84325: done sending task result for task 12673a56-9f93-ed48-b3a5-000000001a79 7557 1726882123.84329: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "type == 'tap' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was False" } 7557 1726882123.84374: no more pending results, returning what we have 7557 1726882123.84379: results queue empty 7557 1726882123.84380: checking for any_errors_fatal 7557 1726882123.84386: done checking for any_errors_fatal 7557 1726882123.84387: checking for max_fail_percentage 7557 1726882123.84388: done checking for max_fail_percentage 7557 1726882123.84389: checking to see if all hosts have failed and the running result is not ok 7557 1726882123.84390: done checking to see if all hosts have failed 7557 1726882123.84390: getting the remaining hosts for this loop 7557 1726882123.84392: done getting the remaining hosts for this loop 7557 1726882123.84397: getting the next task for host managed_node3 7557 1726882123.84405: done getting next task for host managed_node3 7557 1726882123.84409: ^ task is: TASK: Verify network state restored to default 7557 1726882123.84412: ^ state is: HOST STATE: block=2, task=41, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7557 1726882123.84416: getting variables 7557 1726882123.84417: in VariableManager get_vars() 7557 1726882123.84470: Calling all_inventory to load vars for managed_node3 7557 1726882123.84473: Calling groups_inventory to load vars for managed_node3 7557 1726882123.84476: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882123.84487: Calling all_plugins_play to load vars for managed_node3 7557 1726882123.84490: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882123.84605: Calling groups_plugins_play to load vars for managed_node3 7557 1726882123.86768: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882123.88480: done with get_vars() 7557 1726882123.88513: done getting variables TASK [Verify network state restored to default] ******************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_auto_gateway.yml:149 Friday 20 September 2024 21:28:43 -0400 (0:00:00.070) 0:00:49.739 ****** 7557 1726882123.88616: entering _queue_task() for managed_node3/include_tasks 7557 1726882123.89046: worker is 1 (out of 1 available) 7557 1726882123.89062: exiting _queue_task() for managed_node3/include_tasks 7557 1726882123.89077: done queuing things up, now waiting for results queue to drain 7557 1726882123.89079: waiting for pending results... 7557 1726882123.89514: running TaskExecutor() for managed_node3/TASK: Verify network state restored to default 7557 1726882123.89519: in run() - task 12673a56-9f93-ed48-b3a5-000000000151 7557 1726882123.89523: variable 'ansible_search_path' from source: unknown 7557 1726882123.89526: calling self._execute() 7557 1726882123.89639: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882123.89650: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882123.89663: variable 'omit' from source: magic vars 7557 1726882123.90300: variable 'ansible_distribution_major_version' from source: facts 7557 1726882123.90303: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882123.90307: _execute() done 7557 1726882123.90309: dumping result to json 7557 1726882123.90311: done dumping result, returning 7557 1726882123.90313: done running TaskExecutor() for managed_node3/TASK: Verify network state restored to default [12673a56-9f93-ed48-b3a5-000000000151] 7557 1726882123.90315: sending task result for task 12673a56-9f93-ed48-b3a5-000000000151 7557 1726882123.90381: done sending task result for task 12673a56-9f93-ed48-b3a5-000000000151 7557 1726882123.90385: WORKER PROCESS EXITING 7557 1726882123.90416: no more pending results, returning what we have 7557 1726882123.90422: in VariableManager get_vars() 7557 1726882123.90482: Calling all_inventory to load vars for managed_node3 7557 1726882123.90485: Calling groups_inventory to load vars for managed_node3 7557 1726882123.90488: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882123.90502: Calling all_plugins_play to load vars for managed_node3 7557 1726882123.90505: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882123.90508: Calling groups_plugins_play to load vars for managed_node3 7557 1726882123.92772: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882123.94363: done with get_vars() 7557 1726882123.94385: variable 'ansible_search_path' from 
source: unknown 7557 1726882123.94407: we have included files to process 7557 1726882123.94409: generating all_blocks data 7557 1726882123.94410: done generating all_blocks data 7557 1726882123.94416: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 7557 1726882123.94418: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 7557 1726882123.94421: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 7557 1726882123.94930: done processing included file 7557 1726882123.94932: iterating over new_blocks loaded from include file 7557 1726882123.94934: in VariableManager get_vars() 7557 1726882123.94964: done with get_vars() 7557 1726882123.94966: filtering new block on tags 7557 1726882123.94985: done filtering new block on tags 7557 1726882123.94987: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml for managed_node3 7557 1726882123.94992: extending task lists for all hosts with included blocks 7557 1726882124.01427: done extending task lists 7557 1726882124.01429: done processing included files 7557 1726882124.01430: results queue empty 7557 1726882124.01431: checking for any_errors_fatal 7557 1726882124.01434: done checking for any_errors_fatal 7557 1726882124.01435: checking for max_fail_percentage 7557 1726882124.01436: done checking for max_fail_percentage 7557 1726882124.01436: checking to see if all hosts have failed and the running result is not ok 7557 1726882124.01437: done checking to see if all hosts have failed 7557 1726882124.01438: getting the remaining hosts for this loop 7557 1726882124.01439: done getting the remaining hosts for this loop 7557 1726882124.01441: getting the next task for host managed_node3 7557 1726882124.01444: done getting next task for host managed_node3 7557 1726882124.01447: ^ task is: TASK: Check routes and DNS 7557 1726882124.01449: ^ state is: HOST STATE: block=2, task=42, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7557 1726882124.01451: getting variables 7557 1726882124.01452: in VariableManager get_vars() 7557 1726882124.01477: Calling all_inventory to load vars for managed_node3 7557 1726882124.01479: Calling groups_inventory to load vars for managed_node3 7557 1726882124.01481: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882124.01487: Calling all_plugins_play to load vars for managed_node3 7557 1726882124.01489: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882124.01491: Calling groups_plugins_play to load vars for managed_node3 7557 1726882124.02707: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882124.04555: done with get_vars() 7557 1726882124.04590: done getting variables 7557 1726882124.04640: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Check routes and DNS] **************************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:6 Friday 20 September 2024 21:28:44 -0400 (0:00:00.160) 0:00:49.899 ****** 7557 1726882124.04674: entering _queue_task() for managed_node3/shell 7557 1726882124.05157: worker is 1 (out of 1 available) 7557 1726882124.05170: exiting _queue_task() for managed_node3/shell 7557 1726882124.05182: done queuing things up, now waiting for results queue to drain 7557 1726882124.05183: waiting for pending results... 7557 1726882124.05399: running TaskExecutor() for managed_node3/TASK: Check routes and DNS 7557 1726882124.05528: in run() - task 12673a56-9f93-ed48-b3a5-000000001d93 7557 1726882124.05533: variable 'ansible_search_path' from source: unknown 7557 1726882124.05535: variable 'ansible_search_path' from source: unknown 7557 1726882124.05700: calling self._execute() 7557 1726882124.05704: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882124.05707: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882124.05710: variable 'omit' from source: magic vars 7557 1726882124.06177: variable 'ansible_distribution_major_version' from source: facts 7557 1726882124.06180: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882124.06183: variable 'omit' from source: magic vars 7557 1726882124.06185: variable 'omit' from source: magic vars 7557 1726882124.06210: variable 'omit' from source: magic vars 7557 1726882124.06407: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7557 1726882124.06411: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7557 1726882124.06414: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7557 1726882124.06416: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7557 1726882124.06419: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7557 1726882124.06422: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7557 1726882124.06424: variable 
'ansible_host' from source: host vars for 'managed_node3' 7557 1726882124.06427: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882124.06660: Set connection var ansible_module_compression to ZIP_DEFLATED 7557 1726882124.06663: Set connection var ansible_shell_executable to /bin/sh 7557 1726882124.06666: Set connection var ansible_shell_type to sh 7557 1726882124.06669: Set connection var ansible_pipelining to False 7557 1726882124.06671: Set connection var ansible_connection to ssh 7557 1726882124.06673: Set connection var ansible_timeout to 10 7557 1726882124.06675: variable 'ansible_shell_executable' from source: unknown 7557 1726882124.06678: variable 'ansible_connection' from source: unknown 7557 1726882124.06681: variable 'ansible_module_compression' from source: unknown 7557 1726882124.06683: variable 'ansible_shell_type' from source: unknown 7557 1726882124.06685: variable 'ansible_shell_executable' from source: unknown 7557 1726882124.06687: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882124.06689: variable 'ansible_pipelining' from source: unknown 7557 1726882124.06692: variable 'ansible_timeout' from source: unknown 7557 1726882124.06700: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882124.06871: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 7557 1726882124.06875: variable 'omit' from source: magic vars 7557 1726882124.06877: starting attempt loop 7557 1726882124.06880: running the handler 7557 1726882124.06882: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 7557 1726882124.06884: _low_level_execute_command(): starting 7557 1726882124.06886: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7557 1726882124.07796: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7557 1726882124.07855: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7557 1726882124.07865: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882124.07927: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882124.07950: stderr chunk (state=3): >>>debug2: fd 3 setting 
O_NONBLOCK <<< 7557 1726882124.07970: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882124.08053: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882124.09722: stdout chunk (state=3): >>>/root <<< 7557 1726882124.09898: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882124.09902: stdout chunk (state=3): >>><<< 7557 1726882124.09933: stderr chunk (state=3): >>><<< 7557 1726882124.10014: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7557 1726882124.10041: _low_level_execute_command(): starting 7557 1726882124.10045: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882124.100127-9484-100766232200592 `" && echo ansible-tmp-1726882124.100127-9484-100766232200592="` echo /root/.ansible/tmp/ansible-tmp-1726882124.100127-9484-100766232200592 `" ) && sleep 0' 7557 1726882124.11245: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7557 1726882124.11249: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7557 1726882124.11252: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882124.11263: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7557 1726882124.11266: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 <<< 7557 1726882124.11269: stderr chunk (state=3): >>>debug2: match not found <<< 7557 1726882124.11325: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882124.11328: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7557 1726882124.11331: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.10.229 is address <<< 7557 1726882124.11334: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7557 1726882124.11464: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882124.11468: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882124.11470: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882124.11475: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882124.11533: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882124.13488: stdout chunk (state=3): >>>ansible-tmp-1726882124.100127-9484-100766232200592=/root/.ansible/tmp/ansible-tmp-1726882124.100127-9484-100766232200592 <<< 7557 1726882124.13562: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882124.13606: stdout chunk (state=3): >>><<< 7557 1726882124.13618: stderr chunk (state=3): >>><<< 7557 1726882124.13810: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882124.100127-9484-100766232200592=/root/.ansible/tmp/ansible-tmp-1726882124.100127-9484-100766232200592 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7557 1726882124.13814: variable 'ansible_module_compression' from source: unknown 7557 1726882124.13816: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-7557ap94rh2e/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 7557 1726882124.13946: variable 'ansible_facts' from source: unknown 7557 1726882124.14054: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882124.100127-9484-100766232200592/AnsiballZ_command.py 7557 1726882124.14271: Sending initial data 7557 1726882124.14274: Sent initial data (153 bytes) 7557 1726882124.14804: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882124.14824: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7557 1726882124.14912: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882124.14933: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882124.14946: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882124.15023: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882124.16642: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 7557 1726882124.16681: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 7557 1726882124.16737: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-7557ap94rh2e/tmpz9sam6zy /root/.ansible/tmp/ansible-tmp-1726882124.100127-9484-100766232200592/AnsiballZ_command.py <<< 7557 1726882124.16740: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882124.100127-9484-100766232200592/AnsiballZ_command.py" <<< 7557 1726882124.16844: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-7557ap94rh2e/tmpz9sam6zy" to remote "/root/.ansible/tmp/ansible-tmp-1726882124.100127-9484-100766232200592/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882124.100127-9484-100766232200592/AnsiballZ_command.py" <<< 7557 1726882124.17831: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882124.17835: stderr chunk (state=3): >>><<< 7557 1726882124.17855: stdout chunk (state=3): >>><<< 7557 1726882124.17970: done transferring module to remote 7557 1726882124.17973: _low_level_execute_command(): starting 7557 1726882124.17976: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882124.100127-9484-100766232200592/ /root/.ansible/tmp/ansible-tmp-1726882124.100127-9484-100766232200592/AnsiballZ_command.py && sleep 0' 7557 1726882124.18568: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7557 1726882124.18582: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7557 1726882124.18607: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7557 1726882124.18642: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 
10.31.10.229 originally 10.31.10.229 <<< 7557 1726882124.18754: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882124.18772: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882124.18849: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882124.20608: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882124.20619: stdout chunk (state=3): >>><<< 7557 1726882124.20633: stderr chunk (state=3): >>><<< 7557 1726882124.20661: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7557 1726882124.20751: _low_level_execute_command(): starting 7557 1726882124.20755: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882124.100127-9484-100766232200592/AnsiballZ_command.py && sleep 0' 7557 1726882124.21269: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7557 1726882124.21282: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7557 1726882124.21297: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882124.21316: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7557 1726882124.21340: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 <<< 7557 1726882124.21353: stderr chunk (state=3): >>>debug2: match not found <<< 7557 1726882124.21367: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882124.21412: 
stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882124.21481: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882124.21501: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882124.21521: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882124.21684: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882124.37559: stdout chunk (state=3): >>> {"changed": true, "stdout": "IP\n1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000\n link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00\n inet 127.0.0.1/8 scope host lo\n valid_lft forever preferred_lft forever\n inet6 ::1/128 scope host noprefixroute \n valid_lft forever preferred_lft forever\n2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000\n link/ether 12:87:27:91:87:37 brd ff:ff:ff:ff:ff:ff\n altname enX0\n inet 10.31.10.229/22 brd 10.31.11.255 scope global dynamic noprefixroute eth0\n valid_lft 3204sec preferred_lft 3204sec\n inet6 fe80::1087:27ff:fe91:8737/64 scope link noprefixroute \n valid_lft forever preferred_lft forever\nIP ROUTE\ndefault via 10.31.8.1 dev eth0 proto dhcp src 10.31.10.229 metric 100 \n10.31.8.0/22 dev eth0 proto kernel scope link src 10.31.10.229 metric 100 \nIP -6 ROUTE\nfe80::/64 dev eth0 proto kernel metric 1024 pref medium\nRESOLV\n# Generated by NetworkManager\nsearch us-east-1.aws.redhat.com\nnameserver 10.29.169.13\nnameserver 10.29.170.12\nnameserver 10.2.32.1", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "start": "2024-09-20 21:28:44.365016", "end": "2024-09-20 21:28:44.373545", "delta": "0:00:00.008529", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 7557 1726882124.38973: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. 
<<< 7557 1726882124.38996: stdout chunk (state=3): >>><<< 7557 1726882124.39008: stderr chunk (state=3): >>><<< 7557 1726882124.39029: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "IP\n1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000\n link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00\n inet 127.0.0.1/8 scope host lo\n valid_lft forever preferred_lft forever\n inet6 ::1/128 scope host noprefixroute \n valid_lft forever preferred_lft forever\n2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000\n link/ether 12:87:27:91:87:37 brd ff:ff:ff:ff:ff:ff\n altname enX0\n inet 10.31.10.229/22 brd 10.31.11.255 scope global dynamic noprefixroute eth0\n valid_lft 3204sec preferred_lft 3204sec\n inet6 fe80::1087:27ff:fe91:8737/64 scope link noprefixroute \n valid_lft forever preferred_lft forever\nIP ROUTE\ndefault via 10.31.8.1 dev eth0 proto dhcp src 10.31.10.229 metric 100 \n10.31.8.0/22 dev eth0 proto kernel scope link src 10.31.10.229 metric 100 \nIP -6 ROUTE\nfe80::/64 dev eth0 proto kernel metric 1024 pref medium\nRESOLV\n# Generated by NetworkManager\nsearch us-east-1.aws.redhat.com\nnameserver 10.29.169.13\nnameserver 10.29.170.12\nnameserver 10.2.32.1", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "start": "2024-09-20 21:28:44.365016", "end": "2024-09-20 21:28:44.373545", "delta": "0:00:00.008529", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. 
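The _raw_params string in the module arguments above is the entire script for the Check routes and DNS task (check_network_dns.yml:6), with its newlines escaped. Rendered as the shell task it corresponds to (the script text is verbatim from the arguments above; the YAML framing is a sketch):

  - name: Check routes and DNS
    shell: |
      set -euo pipefail
      echo IP
      ip a
      echo IP ROUTE
      ip route
      echo IP -6 ROUTE
      ip -6 route
      echo RESOLV
      if [ -f /etc/resolv.conf ]; then
       cat /etc/resolv.conf
      else
       echo NO /etc/resolv.conf
       ls -alrtF /etc/resolv.* || :
      fi

Its output, echoed back in the task result, shows only the default eth0 address, the DHCP default route via 10.31.8.1, and the NetworkManager-generated resolv.conf, which is what the surrounding "Verify network state restored to default" step is checking for.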
7557 1726882124.39080: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882124.100127-9484-100766232200592/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7557 1726882124.39107: _low_level_execute_command(): starting 7557 1726882124.39118: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882124.100127-9484-100766232200592/ > /dev/null 2>&1 && sleep 0' 7557 1726882124.39746: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7557 1726882124.39762: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7557 1726882124.39778: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882124.39862: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882124.39902: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882124.39919: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882124.39943: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882124.40032: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882124.41813: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882124.41849: stderr chunk (state=3): >>><<< 7557 1726882124.41868: stdout chunk (state=3): >>><<< 7557 1726882124.42099: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7557 1726882124.42102: handler run complete 7557 1726882124.42105: Evaluated conditional (False): False 7557 1726882124.42107: attempt loop complete, returning result 7557 1726882124.42109: _execute() done 7557 1726882124.42111: dumping result to json 7557 1726882124.42113: done dumping result, returning 7557 1726882124.42115: done running TaskExecutor() for managed_node3/TASK: Check routes and DNS [12673a56-9f93-ed48-b3a5-000000001d93] 7557 1726882124.42117: sending task result for task 12673a56-9f93-ed48-b3a5-000000001d93 7557 1726882124.42198: done sending task result for task 12673a56-9f93-ed48-b3a5-000000001d93 7557 1726882124.42202: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "delta": "0:00:00.008529", "end": "2024-09-20 21:28:44.373545", "rc": 0, "start": "2024-09-20 21:28:44.365016" } STDOUT: IP 1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000 link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00 inet 127.0.0.1/8 scope host lo valid_lft forever preferred_lft forever inet6 ::1/128 scope host noprefixroute valid_lft forever preferred_lft forever 2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000 link/ether 12:87:27:91:87:37 brd ff:ff:ff:ff:ff:ff altname enX0 inet 10.31.10.229/22 brd 10.31.11.255 scope global dynamic noprefixroute eth0 valid_lft 3204sec preferred_lft 3204sec inet6 fe80::1087:27ff:fe91:8737/64 scope link noprefixroute valid_lft forever preferred_lft forever IP ROUTE default via 10.31.8.1 dev eth0 proto dhcp src 10.31.10.229 metric 100 10.31.8.0/22 dev eth0 proto kernel scope link src 10.31.10.229 metric 100 IP -6 ROUTE fe80::/64 dev eth0 proto kernel metric 1024 pref medium RESOLV # Generated by NetworkManager search us-east-1.aws.redhat.com nameserver 10.29.169.13 nameserver 10.29.170.12 nameserver 10.2.32.1 7557 1726882124.42279: no more pending results, returning what we have 7557 1726882124.42285: results queue empty 7557 1726882124.42286: checking for any_errors_fatal 7557 1726882124.42287: done checking for any_errors_fatal 7557 1726882124.42288: checking for max_fail_percentage 7557 1726882124.42290: done checking for max_fail_percentage 7557 1726882124.42291: checking to see if all hosts have failed and the running result is not ok 7557 1726882124.42292: done checking to see if all hosts have failed 7557 1726882124.42295: getting the remaining hosts for this loop 7557 1726882124.42297: done getting the remaining hosts for this loop 7557 1726882124.42300: getting the next task for host managed_node3 7557 1726882124.42307: done getting next task for 
host managed_node3 7557 1726882124.42309: ^ task is: TASK: Verify DNS and network connectivity 7557 1726882124.42317: ^ state is: HOST STATE: block=2, task=42, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7557 1726882124.42325: getting variables 7557 1726882124.42327: in VariableManager get_vars() 7557 1726882124.42381: Calling all_inventory to load vars for managed_node3 7557 1726882124.42388: Calling groups_inventory to load vars for managed_node3 7557 1726882124.42391: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882124.42454: Calling all_plugins_play to load vars for managed_node3 7557 1726882124.42458: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882124.42462: Calling groups_plugins_play to load vars for managed_node3 7557 1726882124.44579: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882124.46408: done with get_vars() 7557 1726882124.46435: done getting variables 7557 1726882124.46505: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Verify DNS and network connectivity] ************************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:24 Friday 20 September 2024 21:28:44 -0400 (0:00:00.418) 0:00:50.318 ****** 7557 1726882124.46537: entering _queue_task() for managed_node3/shell 7557 1726882124.46949: worker is 1 (out of 1 available) 7557 1726882124.46962: exiting _queue_task() for managed_node3/shell 7557 1726882124.46974: done queuing things up, now waiting for results queue to drain 7557 1726882124.46976: waiting for pending results... 
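The follow-up task, Verify DNS and network connectivity (check_network_dns.yml:24), is also a shell task, and the conditional evaluations just below gate it on the reported distribution in addition to the usual major-version check. A sketch of that gate (the script body is not quoted in this portion of the log, so only a placeholder is shown):

  - name: Verify DNS and network connectivity
    shell: "..."   # placeholder; the real script is not visible in this portion of the log
    when: ansible_facts["distribution"] == "CentOS"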
7557 1726882124.47386: running TaskExecutor() for managed_node3/TASK: Verify DNS and network connectivity 7557 1726882124.47409: in run() - task 12673a56-9f93-ed48-b3a5-000000001d94 7557 1726882124.47430: variable 'ansible_search_path' from source: unknown 7557 1726882124.47484: variable 'ansible_search_path' from source: unknown 7557 1726882124.47488: calling self._execute() 7557 1726882124.47590: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882124.47609: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882124.47624: variable 'omit' from source: magic vars 7557 1726882124.48013: variable 'ansible_distribution_major_version' from source: facts 7557 1726882124.48037: Evaluated conditional (ansible_distribution_major_version != '6'): True 7557 1726882124.48202: variable 'ansible_facts' from source: unknown 7557 1726882124.48958: Evaluated conditional (ansible_facts["distribution"] == "CentOS"): True 7557 1726882124.48972: variable 'omit' from source: magic vars 7557 1726882124.49022: variable 'omit' from source: magic vars 7557 1726882124.49055: variable 'omit' from source: magic vars 7557 1726882124.49111: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7557 1726882124.49399: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7557 1726882124.49402: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7557 1726882124.49405: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7557 1726882124.49407: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7557 1726882124.49409: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7557 1726882124.49411: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882124.49414: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882124.49416: Set connection var ansible_module_compression to ZIP_DEFLATED 7557 1726882124.49418: Set connection var ansible_shell_executable to /bin/sh 7557 1726882124.49420: Set connection var ansible_shell_type to sh 7557 1726882124.49422: Set connection var ansible_pipelining to False 7557 1726882124.49424: Set connection var ansible_connection to ssh 7557 1726882124.49426: Set connection var ansible_timeout to 10 7557 1726882124.49428: variable 'ansible_shell_executable' from source: unknown 7557 1726882124.49430: variable 'ansible_connection' from source: unknown 7557 1726882124.49433: variable 'ansible_module_compression' from source: unknown 7557 1726882124.49435: variable 'ansible_shell_type' from source: unknown 7557 1726882124.49437: variable 'ansible_shell_executable' from source: unknown 7557 1726882124.49439: variable 'ansible_host' from source: host vars for 'managed_node3' 7557 1726882124.49441: variable 'ansible_pipelining' from source: unknown 7557 1726882124.49443: variable 'ansible_timeout' from source: unknown 7557 1726882124.49445: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7557 1726882124.49596: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 7557 1726882124.49698: variable 'omit' from source: magic vars 7557 1726882124.49701: starting attempt loop 7557 1726882124.49704: running the handler 7557 1726882124.49781: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 7557 1726882124.49785: _low_level_execute_command(): starting 7557 1726882124.49787: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7557 1726882124.50879: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7557 1726882124.50899: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7557 1726882124.50915: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882124.51000: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882124.51038: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882124.51053: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882124.51074: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882124.51154: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882124.52789: stdout chunk (state=3): >>>/root <<< 7557 1726882124.52909: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882124.52921: stdout chunk (state=3): >>><<< 7557 1726882124.52938: stderr chunk (state=3): >>><<< 7557 1726882124.52966: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7557 1726882124.53200: _low_level_execute_command(): starting 7557 1726882124.53204: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882124.531069-9509-23420555767190 `" && echo ansible-tmp-1726882124.531069-9509-23420555767190="` echo /root/.ansible/tmp/ansible-tmp-1726882124.531069-9509-23420555767190 `" ) && sleep 0' 7557 1726882124.54016: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882124.54034: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7557 1726882124.54123: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882124.54137: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882124.54212: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882124.56140: stdout chunk (state=3): >>>ansible-tmp-1726882124.531069-9509-23420555767190=/root/.ansible/tmp/ansible-tmp-1726882124.531069-9509-23420555767190 <<< 7557 1726882124.56352: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882124.56355: stdout chunk (state=3): >>><<< 7557 1726882124.56362: stderr chunk (state=3): >>><<< 7557 1726882124.56380: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882124.531069-9509-23420555767190=/root/.ansible/tmp/ansible-tmp-1726882124.531069-9509-23420555767190 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 
10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7557 1726882124.56422: variable 'ansible_module_compression' from source: unknown 7557 1726882124.56466: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-7557ap94rh2e/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 7557 1726882124.56552: variable 'ansible_facts' from source: unknown 7557 1726882124.56631: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882124.531069-9509-23420555767190/AnsiballZ_command.py 7557 1726882124.57310: Sending initial data 7557 1726882124.57320: Sent initial data (152 bytes) 7557 1726882124.58110: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7557 1726882124.58116: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7557 1726882124.58135: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882124.58142: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7557 1726882124.58154: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 <<< 7557 1726882124.58161: stderr chunk (state=3): >>>debug2: match not found <<< 7557 1726882124.58171: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882124.58198: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration <<< 7557 1726882124.58201: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found <<< 7557 1726882124.58204: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882124.58261: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882124.58266: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882124.58306: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882124.60024: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" 
revision 1 debug2: Sending SSH2_FXP_REALPATH "." <<< 7557 1726882124.60200: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-7557ap94rh2e/tmpb9vtdsf_ /root/.ansible/tmp/ansible-tmp-1726882124.531069-9509-23420555767190/AnsiballZ_command.py <<< 7557 1726882124.60204: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882124.531069-9509-23420555767190/AnsiballZ_command.py" <<< 7557 1726882124.60207: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-7557ap94rh2e/tmpb9vtdsf_" to remote "/root/.ansible/tmp/ansible-tmp-1726882124.531069-9509-23420555767190/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882124.531069-9509-23420555767190/AnsiballZ_command.py" <<< 7557 1726882124.61206: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882124.61249: stderr chunk (state=3): >>><<< 7557 1726882124.61252: stdout chunk (state=3): >>><<< 7557 1726882124.61289: done transferring module to remote 7557 1726882124.61303: _low_level_execute_command(): starting 7557 1726882124.61307: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882124.531069-9509-23420555767190/ /root/.ansible/tmp/ansible-tmp-1726882124.531069-9509-23420555767190/AnsiballZ_command.py && sleep 0' 7557 1726882124.61926: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7557 1726882124.61951: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 7557 1726882124.61967: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882124.61986: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882124.62065: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882124.63812: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882124.63862: stderr chunk (state=3): >>><<< 7557 1726882124.63873: stdout chunk (state=3): >>><<< 7557 1726882124.63899: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7557 1726882124.63909: _low_level_execute_command(): starting 7557 1726882124.63912: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882124.531069-9509-23420555767190/AnsiballZ_command.py && sleep 0' 7557 1726882124.64500: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882124.64504: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882124.64574: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882125.11676: stdout chunk (state=3): >>> {"changed": true, "stdout": "CHECK DNS AND CONNECTIVITY\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org 
mirrors.centos.org mirrors.fedoraproject.org", "stderr": " % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 305 100 305 0 0 3624 0 --:--:-- --:--:-- --:--:-- 3630\n % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 291 100 291 0 0 1403 0 --:--:-- --:--:-- --:--:-- 1405", "rc": 0, "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "start": "2024-09-20 21:28:44.796910", "end": "2024-09-20 21:28:45.112942", "delta": "0:00:00.316032", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 7557 1726882125.13303: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882125.13307: stderr chunk (state=3): >>>Shared connection to 10.31.10.229 closed. <<< 7557 1726882125.13314: stderr chunk (state=3): >>><<< 7557 1726882125.13316: stdout chunk (state=3): >>><<< 7557 1726882125.13341: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "CHECK DNS AND CONNECTIVITY\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org", "stderr": " % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 305 100 305 0 0 3624 0 --:--:-- --:--:-- --:--:-- 3630\n % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 291 100 291 0 0 1403 0 --:--:-- --:--:-- 
--:--:-- 1405", "rc": 0, "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "start": "2024-09-20 21:28:44.796910", "end": "2024-09-20 21:28:45.112942", "delta": "0:00:00.316032", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. 7557 1726882125.13385: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts "$host"; then\n echo FAILED to lookup host "$host"\n exit 1\n fi\n if ! 
curl -o /dev/null https://"$host"; then\n echo FAILED to contact host "$host"\n exit 1\n fi\ndone\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882124.531069-9509-23420555767190/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7557 1726882125.13396: _low_level_execute_command(): starting 7557 1726882125.13428: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882124.531069-9509-23420555767190/ > /dev/null 2>&1 && sleep 0' 7557 1726882125.14684: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7557 1726882125.14698: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7557 1726882125.14818: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7557 1726882125.14906: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 7557 1726882125.15111: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7557 1726882125.15185: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7557 1726882125.17150: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7557 1726882125.17154: stdout chunk (state=3): >>><<< 7557 1726882125.17161: stderr chunk (state=3): >>><<< 7557 1726882125.17181: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final 
all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7557 1726882125.17186: handler run complete 7557 1726882125.17214: Evaluated conditional (False): False 7557 1726882125.17234: attempt loop complete, returning result 7557 1726882125.17237: _execute() done 7557 1726882125.17239: dumping result to json 7557 1726882125.17244: done dumping result, returning 7557 1726882125.17253: done running TaskExecutor() for managed_node3/TASK: Verify DNS and network connectivity [12673a56-9f93-ed48-b3a5-000000001d94] 7557 1726882125.17258: sending task result for task 12673a56-9f93-ed48-b3a5-000000001d94 ok: [managed_node3] => { "changed": false, "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "delta": "0:00:00.316032", "end": "2024-09-20 21:28:45.112942", "rc": 0, "start": "2024-09-20 21:28:44.796910" } STDOUT: CHECK DNS AND CONNECTIVITY 2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org 2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org 2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org 2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org 2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org 2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org 2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org STDERR: % Total % Received % Xferd Average Speed Time Time Time Current Dload Upload Total Spent Left Speed 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0 100 305 100 305 0 0 3624 0 --:--:-- --:--:-- --:--:-- 3630 % Total % Received % Xferd Average Speed Time Time Time Current Dload Upload Total Spent Left Speed 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0 100 291 100 291 0 0 1403 0 --:--:-- --:--:-- --:--:-- 1405 7557 1726882125.17551: no more pending results, returning what we have 7557 1726882125.17556: results queue empty 7557 1726882125.17557: checking for any_errors_fatal 7557 1726882125.17569: done checking for any_errors_fatal 7557 1726882125.17569: checking for max_fail_percentage 7557 1726882125.17571: done checking for max_fail_percentage 7557 1726882125.17572: checking to see if all hosts have failed and the running result is not ok 7557 1726882125.17573: done checking to see if 
all hosts have failed 7557 1726882125.17574: getting the remaining hosts for this loop 7557 1726882125.17576: done getting the remaining hosts for this loop 7557 1726882125.17580: getting the next task for host managed_node3 7557 1726882125.17589: done getting next task for host managed_node3 7557 1726882125.17592: ^ task is: TASK: meta (flush_handlers) 7557 1726882125.17767: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7557 1726882125.17783: getting variables 7557 1726882125.17785: in VariableManager get_vars() 7557 1726882125.17926: Calling all_inventory to load vars for managed_node3 7557 1726882125.17929: Calling groups_inventory to load vars for managed_node3 7557 1726882125.17931: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882125.17943: Calling all_plugins_play to load vars for managed_node3 7557 1726882125.17946: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882125.17950: Calling groups_plugins_play to load vars for managed_node3 7557 1726882125.18703: done sending task result for task 12673a56-9f93-ed48-b3a5-000000001d94 7557 1726882125.18707: WORKER PROCESS EXITING 7557 1726882125.20276: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882125.23417: done with get_vars() 7557 1726882125.23449: done getting variables 7557 1726882125.23522: in VariableManager get_vars() 7557 1726882125.23546: Calling all_inventory to load vars for managed_node3 7557 1726882125.23548: Calling groups_inventory to load vars for managed_node3 7557 1726882125.23550: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882125.23556: Calling all_plugins_play to load vars for managed_node3 7557 1726882125.23558: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882125.23561: Calling groups_plugins_play to load vars for managed_node3 7557 1726882125.24827: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882125.26328: done with get_vars() 7557 1726882125.26359: done queuing things up, now waiting for results queue to drain 7557 1726882125.26362: results queue empty 7557 1726882125.26362: checking for any_errors_fatal 7557 1726882125.26366: done checking for any_errors_fatal 7557 1726882125.26367: checking for max_fail_percentage 7557 1726882125.26369: done checking for max_fail_percentage 7557 1726882125.26369: checking to see if all hosts have failed and the running result is not ok 7557 1726882125.26370: done checking to see if all hosts have failed 7557 1726882125.26371: getting the remaining hosts for this loop 7557 1726882125.26372: done getting the remaining hosts for this loop 7557 1726882125.26374: getting the next task for host managed_node3 7557 1726882125.26378: done getting next task for host managed_node3 7557 1726882125.26380: ^ task is: TASK: meta (flush_handlers) 7557 1726882125.26381: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7557 1726882125.26384: getting variables 7557 1726882125.26385: in VariableManager get_vars() 7557 1726882125.26407: Calling all_inventory to load vars for managed_node3 7557 1726882125.26409: Calling groups_inventory to load vars for managed_node3 7557 1726882125.26411: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882125.26416: Calling all_plugins_play to load vars for managed_node3 7557 1726882125.26418: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882125.26422: Calling groups_plugins_play to load vars for managed_node3 7557 1726882125.27542: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882125.29116: done with get_vars() 7557 1726882125.29135: done getting variables 7557 1726882125.29184: in VariableManager get_vars() 7557 1726882125.29205: Calling all_inventory to load vars for managed_node3 7557 1726882125.29208: Calling groups_inventory to load vars for managed_node3 7557 1726882125.29210: Calling all_plugins_inventory to load vars for managed_node3 7557 1726882125.29215: Calling all_plugins_play to load vars for managed_node3 7557 1726882125.29217: Calling groups_plugins_inventory to load vars for managed_node3 7557 1726882125.29221: Calling groups_plugins_play to load vars for managed_node3 7557 1726882125.30280: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7557 1726882125.32713: done with get_vars() 7557 1726882125.32749: done queuing things up, now waiting for results queue to drain 7557 1726882125.32751: results queue empty 7557 1726882125.32752: checking for any_errors_fatal 7557 1726882125.32753: done checking for any_errors_fatal 7557 1726882125.32754: checking for max_fail_percentage 7557 1726882125.32755: done checking for max_fail_percentage 7557 1726882125.32756: checking to see if all hosts have failed and the running result is not ok 7557 1726882125.32756: done checking to see if all hosts have failed 7557 1726882125.32757: getting the remaining hosts for this loop 7557 1726882125.32759: done getting the remaining hosts for this loop 7557 1726882125.32762: getting the next task for host managed_node3 7557 1726882125.32765: done getting next task for host managed_node3 7557 1726882125.32766: ^ task is: None 7557 1726882125.32768: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7557 1726882125.32769: done queuing things up, now waiting for results queue to drain 7557 1726882125.32770: results queue empty 7557 1726882125.32770: checking for any_errors_fatal 7557 1726882125.32771: done checking for any_errors_fatal 7557 1726882125.32771: checking for max_fail_percentage 7557 1726882125.32772: done checking for max_fail_percentage 7557 1726882125.32773: checking to see if all hosts have failed and the running result is not ok 7557 1726882125.32774: done checking to see if all hosts have failed 7557 1726882125.32776: getting the next task for host managed_node3 7557 1726882125.32779: done getting next task for host managed_node3 7557 1726882125.32780: ^ task is: None 7557 1726882125.32781: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False

PLAY RECAP *********************************************************************
managed_node3              : ok=128  changed=4    unreachable=0    failed=0    skipped=118  rescued=0    ignored=0

Friday 20 September 2024  21:28:45 -0400 (0:00:00.863)       0:00:51.182 ******
===============================================================================
Install iproute --------------------------------------------------------- 3.52s
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:16
fedora.linux_system_roles.network : Check which services are running ---- 1.93s
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
fedora.linux_system_roles.network : Check which services are running ---- 1.75s
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
fedora.linux_system_roles.network : Check which services are running ---- 1.71s
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
fedora.linux_system_roles.network : Check which services are running ---- 1.69s
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
Gathering Facts --------------------------------------------------------- 1.44s
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tests_auto_gateway_nm.yml:6
fedora.linux_system_roles.network : Check which packages are installed --- 1.31s
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
fedora.linux_system_roles.network : Configure networking connection profiles --- 1.21s
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159
Create veth interface veth0 --------------------------------------------- 1.12s
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:27
Create veth interface veth0 --------------------------------------------- 1.04s
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:27
Gathering Facts --------------------------------------------------------- 0.97s
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_auto_gateway.yml:3
Verify DNS and network connectivity ------------------------------------- 0.86s
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:24
fedora.linux_system_roles.network : Enable and start NetworkManager ----- 0.83s
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122
Install iproute --------------------------------------------------------- 0.77s
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:16
Install iproute --------------------------------------------------------- 0.76s
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:16
fedora.linux_system_roles.network : Enable and start NetworkManager ----- 0.74s
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122
Install iproute --------------------------------------------------------- 0.74s
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:16
fedora.linux_system_roles.network : Configure networking connection profiles --- 0.73s
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159
fedora.linux_system_roles.network : Configure networking connection profiles --- 0.72s
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159
fedora.linux_system_roles.network : Check which packages are installed --- 0.71s
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
7557 1726882125.32977: RUNNING CLEANUP
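
Editor's note: the remote temp-directory command that _low_level_execute_command() ran above, before the module was transferred, is hard to read once flattened into the log, so it is restated below with whitespace and comments. The command text and the timestamped directory name are copied from the log; the explanation of the backtick-echo wrapping (it lets the remote login shell expand ~ and variables in the configured remote_tmp before the path is used) is general Ansible behaviour rather than something this log states.

    /bin/sh -c '(
      umask 77 &&                                # make the directories readable only by the remote user
      mkdir -p "` echo /root/.ansible/tmp `" &&  # ensure the base remote_tmp exists; backtick-echo allows shell expansion of the path
      mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882124.531069-9509-23420555767190 `" &&   # per-task working directory
      echo ansible-tmp-1726882124.531069-9509-23420555767190="` echo /root/.ansible/tmp/ansible-tmp-1726882124.531069-9509-23420555767190 `"   # print the resolved path so the controller can record it
    ) && sleep 0'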
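
Editor's note: the _raw_params payload executed by the command module above is the shell script behind the "Verify DNS and network connectivity" task (check_network_dns.yml:24 in the timing summary). Unescaped from the JSON in the log, with indentation normalised and light comments added, it reads:

    set -euo pipefail       # abort on the first failing command, unset variable, or pipe failure
    echo CHECK DNS AND CONNECTIVITY
    for host in mirrors.fedoraproject.org mirrors.centos.org; do
      if ! getent hosts "$host"; then                 # DNS check via the system resolver
        echo FAILED to lookup host "$host"
        exit 1
      fi
      if ! curl -o /dev/null https://"$host"; then    # HTTPS reachability check; response body is discarded
        echo FAILED to contact host "$host"
        exit 1
      fi
    done

On this run both hosts resolved (the IPv6 answers shown in STDOUT) and both curl requests succeeded, so the task finished with rc=0.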